Abstract

This paper presents a general information-theoretic approach for obtaining lower bounds on the number of examples required for Probably Approximately Correct (PAC) learning in the presence of noise. This approach deals directly with the fundamental information quantities, avoiding a Bayesian analysis. The technique is applied to several different models, illustrating its generality and power. The resulting bounds add logarithmic factors to (or improve the constants in) previously known lower bounds.
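For context, the "previously known lower bounds" mentioned above typically take the following form; this is a hedged sketch of standard results for a concept class $C$ with $d = \mathrm{VCdim}(C)$, accuracy $\varepsilon$, confidence $\delta$, and (in the noisy case) random classification noise at rate $\eta < 1/2$, not a statement of this paper's own bounds:
\[
  m \;=\; \Omega\!\left(\frac{d}{\varepsilon} + \frac{1}{\varepsilon}\ln\frac{1}{\delta}\right)
  \quad\text{(noise-free)},
  \qquad
  m \;=\; \Omega\!\left(\frac{d + \ln(1/\delta)}{\varepsilon\,(1-2\eta)^{2}}\right)
  \quad\text{(classification noise at rate } \eta\text{)}.
\]
The paper's contribution is to sharpen bounds of this kind, adding logarithmic factors or improving the constants.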