Transfer learning, or domain adaptation, is concerned with machine learning problems in which training and testing data come from possibly different probability distributions. In this work, we give an information-theoretic analysis on the generalization error and excess risk of transfer learning algorithms, following a line of work initiated by Russo and Xu. Our results suggest, perhaps as expected, that the Kullback-Leibler (KL) divergence $D(\mu||\mu')$ plays an important role in the characterizations, where $\mu$ and $\mu'$ denote the distributions of the training data and the testing data, respectively. Specifically, we provide generalization error upper bounds for the empirical risk minimization (ERM) algorithm where data from both distr...
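The KL term $D(\mu||\mu')$ in the abstract above measures the mismatch between the training and testing distributions. As a minimal illustrative sketch (not the paper's method), the following computes this divergence for two discrete distributions given as probability tables; the dict-based representation and the function name are our own choices for illustration:

```python
import math

def kl_divergence(mu, mu_prime):
    """D(mu || mu') for discrete distributions given as dicts
    mapping outcomes to probabilities. Requires mu' > 0 wherever
    mu > 0 (absolute continuity), otherwise the divergence is infinite."""
    total = 0.0
    for x, p in mu.items():
        if p == 0.0:
            continue  # 0 * log(0/q) = 0 by convention
        q = mu_prime.get(x, 0.0)
        if q == 0.0:
            return math.inf  # mu not absolutely continuous w.r.t. mu'
        total += p * math.log(p / q)
    return total

# Toy source (training) vs. target (testing) distributions:
mu = {"a": 0.5, "b": 0.5}
mu_prime = {"a": 0.9, "b": 0.1}
print(kl_divergence(mu, mu_prime))  # positive; grows with the shift
```

When $\mu = \mu'$ the divergence is zero and the bounds in the abstract reduce to the standard single-distribution setting.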
Domain adaptation (DA) is an important and emerging field of machine learning ...
We consider information-theoretic bounds on the expected generalization error for statistical learni...
This paper uses information-theoretic tools to analyze the generalization error in unsupervised doma...
The focus of this thesis is on understanding machine learning algorithms from an information-theoret...
In recent years, tools from information theory have played an increasingly prevalent role in statist...
Generalization error bounds are critical to understanding the performance of machine learning models...
In this work, the probability of an event under some joint distribution is bounded by measuring it w...
A recent line of works, initiated by Russo and Xu, has shown that the generalization error of a lear...
We characterize the statistical efficiency of knowledge transfer through $n$ samples from a teacher ...
The following problem is considered: given a joint distribution $P_{XY}$ and an event $E$, bound $P_{XY}(E)$ ...