We study the learnability of sets in R^n under the Gaussian distribution, taking Gaussian surface area as the “complexity measure” of the sets being learned. Let C_S denote the class of all (measurable) sets with surface area at most S. We first show that the class C_S is learnable to any constant accuracy in time n^{O(S^2)}, even in the arbitrary-noise (“agnostic”) model. Complementing this, we also show that any learning algorithm for C_S information-theoretically requires 2^{Ω(S^2)} examples for learning to constant accuracy. These results together show that Gaussian surface area essentially characterizes the computational complexity of learning under the Gaussian distribution. Our approach yields several new learning results, including the following...
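For reference, the complexity measure in question is the standard notion of Gaussian surface area; the display below gives its usual definition (the precise normalization is the conventional one and is assumed rather than quoted from the paper), where φ_n denotes the standard Gaussian density on R^n.

\[
\Gamma(A) \;=\; \liminf_{\delta \to 0^{+}}
\frac{\Pr_{x \sim N(0,\,I_n)}\bigl[\,x \in A_\delta \setminus A\,\bigr]}{\delta},
\qquad
A_\delta = \{\, x \in \mathbb{R}^n : \operatorname{dist}(x, A) \le \delta \,\},
\]
and, for sets with sufficiently smooth boundary, equivalently
\[
\Gamma(A) \;=\; \int_{\partial A} \varphi_n(x)\, d\sigma(x),
\qquad
\varphi_n(x) = (2\pi)^{-n/2} e^{-\|x\|^2/2},
\]
so that \( C_S = \{\, A \subseteq \mathbb{R}^n \text{ measurable} : \Gamma(A) \le S \,\} \).

Two standard data points give a sense of scale: every halfspace has Gaussian surface area at most \(1/\sqrt{2\pi}\), and by Ball's theorem every convex set in R^n has Gaussian surface area \(O(n^{1/4})\).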