All authors contributed equally to this work. We propose a PAC-Bayesian analysis of the transductive learning setting introduced by Vapnik [1998], presenting a family of new bounds on the generalization error. Some of these are derived from their counterparts in the inductive setting; others are new. We also compare their behavior.
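For background (this is not taken from the abstract above), the inductive counterpart that such transductive bounds are usually compared against is a McAllester-style PAC-Bayesian bound: for a prior $\pi$ fixed before seeing the data, any posterior $\rho$, sample size $n$, and confidence level $\delta \in (0,1)$, with probability at least $1-\delta$ over the draw of the sample,

```latex
\mathbb{E}_{h \sim \rho}\!\left[R(h)\right]
\;\le\;
\mathbb{E}_{h \sim \rho}\!\left[\hat{R}(h)\right]
+ \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}}
```

where $R(h)$ is the true risk, $\hat{R}(h)$ the empirical risk, and $\mathrm{KL}(\rho\|\pi)$ the Kullback-Leibler divergence between posterior and prior. In the transductive setting the unlabeled test points are known in advance, which is what allows the bounds discussed above to differ from this inductive template.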
The common method to understand and improve classification rules is to prove bounds on the generaliz...
PAC-Bayesian learning bounds are of the utmost interest to the learning commun...
Transfer learning has received a lot of attention in the machine learning community over the last ye...
Inductive learning is based on inferring a general rule from a finite data set and using it to label...
Inductive learning is based on inferring a general rule from a finite data set and using it to labe...
PAC-Bayesian bounds are known to be tight and informative when studying the generalization ability o...
We present new PAC-Bayesian generalisation bounds for learning problems with unbounded loss function...
Generalised Bayesian learning algorithms are increasingly popular in machine learning, due to their ...
This tutorial gives a concise overview of existing PAC-Bayesian theory focusing on three generalizat...
Meta-learning can successfully acquire useful inductive biases from data. Yet, its generalization pr...
Risk bounds, which are also called generalisation bounds in the statistical learning literature, are...
A PAC teaching model (under helpful distributions) is proposed which introduces the classical ideas...
We apply PAC-Bayesian theory to prove a generalization bound for the case of sequential task solving...