Some of the tightest information-theoretic generalization bounds depend on the average information between the learned hypothesis and a \emph{single} training example. However, these sample-wise bounds were derived only for the \emph{expected} generalization gap. We show that even for the expected \emph{squared} generalization gap no such sample-wise information-theoretic bounds exist. The same is true for PAC-Bayes and single-draw bounds. Remarkably, PAC-Bayes, single-draw, and expected squared generalization gap bounds that depend on information in pairs of examples do exist.
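For orientation only (this illustration is not part of the abstract above), the sample-wise bounds in question control the expected generalization gap through the mutual information $I(W; Z_i)$ between the learned hypothesis $W$ and an individual training example $Z_i$. A representative bound of this kind, stated here under the standard assumption that the loss is $\sigma$-subgaussian under the data distribution, reads
\[
\bigl|\,\mathbb{E}\!\left[\mathrm{gen}(W, S)\right]\bigr| \;\le\; \frac{1}{n}\sum_{i=1}^{n} \sqrt{2\sigma^{2}\, I(W; Z_i)},
\]
where $S = (Z_1, \ldots, Z_n)$ denotes the training set of $n$ i.i.d. examples. The result summarized above states that no bound of this single-example form can control the expected \emph{squared} generalization gap, nor the corresponding PAC-Bayes or single-draw quantities; bounds of those kinds must instead depend on information measured over pairs of examples.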