In recent years, tools from information theory have played an increasingly prevalent role in statistical machine learning. In addition to developing efficient, computationally feasible algorithms for analyzing complex datasets, it is of theoretical importance to determine whether such algorithms are “optimal” in the sense that no other algorithm can lead to smaller statistical error. This paper provides a survey of various techniques used to derive information-theoretic lower bounds for estimation and learning. We focus on the settings of parameter and function estimation, community recovery, and online learning for multi-armed bandits. A common theme is that lower bounds are established by relating the statistical learning problem to a cha...
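To make the reduction alluded to at the end of the abstract concrete, the following is a minimal sketch of the standard route (illustrative only; the notation and the specific form of the bound are ours, not taken from the survey): estimation over a separated packing of parameters is reduced to identifying which packing element generated the data, and Fano's inequality then lower-bounds the testing error via a mutual information term.

\[
  % Assume a 2\delta-separated packing \{\theta_1,\dots,\theta_M\} \subset \Theta,
  % an index V drawn uniformly from \{1,\dots,M\}, and data X \sim P_{\theta_V}.
  \inf_{\hat\theta}\;\sup_{\theta\in\Theta}
    \mathbb{E}_\theta\!\big[\rho(\hat\theta,\theta)\big]
  \;\ge\;
  \delta \,\inf_{\hat V}\Pr\!\big[\hat V \neq V\big]
  \;\ge\;
  \delta \left(1 - \frac{I(V;X) + \log 2}{\log M}\right).
\]

Under these (assumed) conventions, choosing the packing so that $I(V;X)$ grows slowly relative to $\log M$ yields a nontrivial minimax lower bound; the survey's individual settings differ mainly in how the packing and the information term are controlled.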