We present a novel notion of complexity that interpolates between and generalizes some classic existing complexity notions in learning theory: for estimators like empirical risk minimization (ERM) with arbitrary bounded losses, it is upper bounded in terms of data-independent Rademacher complexity; for generalized Bayesian estimators, it is upper bounded by the data-dependent information complexity (also known as stochastic or PAC-Bayesian KL(posterior∥prior) complexity). For (penalized) ERM, the new complexity reduces to (generalized) normalized maximum likelihood (NML) complexity, i.e. a minimax log-loss individual-sequence regret. Our first main result bounds excess risk in terms of the new complexity. Our second main result ...
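For reference, the PAC-Bayesian KL(posterior∥prior) complexity named above appears in the classic McAllester/Maurer-style bound. The following is a standard statement for losses bounded in [0, 1], not a result taken from this abstract; the symbols $\rho$ (posterior), $\pi$ (prior), $L$ (population risk), and $\widehat{L}_n$ (empirical risk) follow common convention:

```latex
% Classic PAC-Bayesian bound (McAllester/Maurer form) for losses in [0,1]:
% with probability at least 1 - \delta over an i.i.d. sample of size n,
% simultaneously for all posteriors \rho,
\mathbb{E}_{h \sim \rho}\, L(h)
  \;\le\; \mathbb{E}_{h \sim \rho}\, \widehat{L}_n(h)
  \;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln \frac{2\sqrt{n}}{\delta}}{2n}}
```

The square-root KL/n term here is exactly the "standard bound" shape that several of the abstracts below (e.g. the $\sqrt{L_n \cdot \mathrm{KL}/n}$ term) aim to tighten.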
In this paper we develop a novel probabilistic generalization bound for regularized kernel learning...
We narrow the width of the confidence interval introduced by Vapnik and Chervonenkis for the risk fu...
We present a new PAC-Bayesian generalization bound. Standard bounds contain a $\sqrt{L_n \cdot \KL/n...
The Structural Risk Minimization principle allows estimating the generalization ability of a learned...
We show a Talagrand-type concentration inequality for Multi-Task Learning (MTL), with which we estab...
We present new excess risk bounds for general unbounded loss functions including log loss and square...
This paper deals with the problem of identifying a connection between the Vapnik–Chervonenkis (VC) E...
Two main concepts studied in machine learning theory are generalization gap (difference between trai...
This paper constructs bounds on the minimax risk under loss functions when statistical estimation is...
This work characterizes the generalization ability of algorithms whose predictions are linear in the...
One of the main open problems in the theory of multi-category margin classific...
We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two ‘s...
We propose new bounds on the error of learning algorithms in terms of a data-dependent notion of com...