We give a general recipe for derandomising PAC-Bayesian bounds using margins, the critical ingredient being that our randomised predictions concentrate around some value. The tools we develop straightforwardly lead to margin bounds for various classifiers, including linear prediction (a class that includes boosting and the support vector machine), single-hidden-layer neural networks with an unusual erf activation function, and deep ReLU networks. Further, we extend to partially-derandomised predictors where only some of the randomness is removed, letting us extend bounds to cases where the concentration properties of our predictors are otherwise poor.
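The concentration property the abstract refers to can be illustrated with the standard PAC-Bayes setting of a Gaussian randomised linear predictor. This is a minimal sketch only, not the paper's construction: the weights `mu`, noise scale `sigma`, and input `x` are hypothetical values chosen for illustration.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): a Gaussian
# randomised linear predictor f_w(x) = <w, x> with w ~ N(mu, sigma^2 I).
# Its randomised output concentrates around the mean predictor <mu, x>
# with standard deviation sigma * ||x|| -- the kind of concentration
# that a margin-based derandomisation argument exploits.
rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0, 0.5])    # hypothetical posterior mean weights
sigma = 0.1                        # hypothetical noise scale
x = np.array([0.3, 0.1, -0.2])     # an arbitrary illustrative input

w_samples = mu + sigma * rng.standard_normal((10_000, mu.size))
outputs = w_samples @ x            # randomised predictions f_w(x)
mean_pred = mu @ x                 # derandomised prediction <mu, x>

print(outputs.mean(), mean_pred)                 # empirical mean ~ <mu, x>
print(outputs.std(), sigma * np.linalg.norm(x))  # spread ~ sigma * ||x||
```

When the margin of the mean predictor exceeds this `sigma * ||x||` spread by enough, the randomised and derandomised classifiers agree with high probability, which is how a PAC-Bayesian bound on the former transfers to the latter.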
A number of results have bounded generalization of a classifier in terms of its margin on the training...
Reweighting adversarial data during training has been recently shown to improve adversarial robustne...
We address the problem of binary linear classification with emphasis on algorithms that lead to sepa...
Risk bounds, which are also called generalisation bounds in the statistical learning literature, are...
PAC-Bayesian bounds are known to be tight and informative when studying the generalization ability o...
We establish a disintegrated PAC-Bayesian bound, for classifiers that are trained via continuous-tim...
We present a bound on the generalisation error of linear classifiers in terms of a refined margin qu...
A learning method is self-certified if it uses all available data to simultane...
We make three related contributions motivated by the challenge of training stochastic neural network...
Typical bounds on generalization of Support Vector Machines are based on the minimum distance betwee...