In this paper we present a general scheme for extending the VC-dimension to the case n > 1. Our scheme defines a wide variety of notions of dimension in which several variants of the VC-dimension, previously introduced in the context of learning, appear as special cases. Our main result is a simple condition characterizing the set of notions of dimension whose finiteness is necessary and sufficient for learning. This provides a variety of new tools for determining the learnability of a class of multi-valued functions. Our characterization is also shown to hold in the "robust" variant of the PAC model and for any "reasonable" loss function.
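The classical binary (n = 1) notion that the abstract generalizes is shattering: a class shatters a point set if every binary labeling of those points is realized by some concept, and the VC-dimension is the largest shattered set size. A minimal brute-force sketch of this definition, with illustrative helper names not taken from the paper:

```python
from itertools import combinations

def shatters(concept_class, points):
    """True iff every binary labeling of `points` is realized by some concept."""
    patterns = {tuple(p in c for p in points) for c in concept_class}
    return len(patterns) == 2 ** len(points)

def vc_dimension(concept_class, domain):
    """Largest d such that some d-subset of the (finite) domain is shattered."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(concept_class, subset)
               for subset in combinations(domain, k)):
            d = k
    return d

# Toy example: intervals restricted to a 4-point domain.
domain = [1, 2, 3, 4]
intervals = [frozenset(x for x in domain if a <= x <= b)
             for a in domain for b in domain]  # a > b yields the empty concept
print(vc_dimension(intervals, domain))  # prints 2: any 2 points shatter, no 3 do
```

Intervals cannot realize the labeling that includes the two endpoints of a 3-point set but excludes the middle point, which is why the dimension stops at 2.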
We present a new general-purpose algorithm for learning classes of [0, 1]-valued functions i...
We consider the problem of learning a concept from examples in the distribution-free model b...
A proof that a concept is learnable provided the Vapnik-Chervonenkis dimension is finite is given. T...
We investigate the PAC learnability of classes of {0, ..., n}-valued functions (n < ∞). For ...
In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays the key role to estimate the...
A stochastic model of learning from examples has been introduced by Valiant [1984]. This...
Learnability in Valiant's PAC learning model has been shown to be strongly related to the exist...
AbstractWe consider the problem of learning real-valued functions from random examples when the func...
Proc. European Conference on Machine Learning, Lecture Notes in Artificial Intelligence 784, 415-418...
The Vapnik-Chervonenkis (VC) dimension plays an important role in statistical learning the...
In this paper, we introduce the discretized-Vapnik-Chervonenkis (VC) dimension for studying the comp...