Abstract

This paper presents a general information-theoretic approach for obtaining lower bounds on the number of examples required for Probably Approximately Correct (PAC) learning in the presence of noise. This approach deals directly with the fundamental information quantities, avoiding a Bayesian analysis. The technique is applied to several different models, illustrating its generality and power. The resulting bounds add logarithmic factors to (or improve the constants in) previously known lower bounds.
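For context, previously known lower bounds of the kind the abstract refers to typically take the following shape in the noisy-PAC literature (stated here as an illustrative assumption, not as this paper's exact theorem): for a concept class of VC dimension $d$, accuracy parameter $\varepsilon$, confidence parameter $\delta$, and random classification noise at rate $\eta < 1/2$, the number of examples $m$ must satisfy

\[
  m \;=\; \Omega\!\left(\frac{d + \log(1/\delta)}{\varepsilon\,(1-2\eta)^{2}}\right).
\]

The $(1-2\eta)^{2}$ factor reflects how classification noise at rate $\eta$ shrinks the information that each labeled example carries about the target concept as $\eta$ approaches $1/2$.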