We study online learning in Boolean domains using kernels which capture feature expansions equivalent to using conjunctions over basic features. We demonstrate a tradeoff between the computational efficiency with which these kernels can be computed and the generalization ability of the resulting classifier. We first describe several kernel functions which capture either limited forms of conjunctions or all conjunctions. We show that these kernels can be used to efficiently run the Perceptron algorithm over an exponential number of conjunctions; however, we also prove that using such kernels the Perceptron algorithm can make an exponential number of mistakes even when learning simple functions. We also consider an analogous use of kernel func...
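To make the construction described in this abstract concrete, the following sketch runs the dual (kernelized) Perceptron with one Boolean kernel of the kind mentioned there: for x, z in {0,1}^n, k(x, z) = 2^(x . z) is the inner product of the expansions of x and z into all monotone conjunctions, since every subset of the coordinates where both vectors are 1 indexes a conjunction satisfied by both inputs. This is an illustrative sketch under that reading, not code from the paper; the toy target and all identifiers are ours.

    import numpy as np

    def monotone_conjunction_kernel(x, z):
        # Inner product of the expansions of x, z in {0,1}^n into all monotone
        # conjunctions: each subset of the coordinates where both vectors are 1
        # contributes a feature equal to 1 on both inputs.
        return 2.0 ** int(np.dot(x, z))

    def kernel_perceptron(X, y, kernel, max_epochs=50):
        # Dual (kernelized) Perceptron: the hypothesis is the list of examples
        # on which a mistake was made, together with their labels.
        mistakes = []
        for _ in range(max_epochs):
            made_mistake = False
            for xi, yi in zip(X, y):
                score = sum(yj * kernel(xj, xi) for xj, yj in mistakes)
                if yi * score <= 0:          # zero margin also counts as a mistake
                    mistakes.append((xi, yi))
                    made_mistake = True
            if not made_mistake:             # a full mistake-free pass: stop
                break
        return mistakes

    def predict(mistakes, kernel, x):
        score = sum(yj * kernel(xj, x) for xj, yj in mistakes)
        return 1 if score > 0 else -1

    # Toy run: the target concept is the monotone conjunction x1 AND x2 over {0,1}^3.
    X = [np.array(v) for v in [(1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1), (0, 0, 0)]]
    y = [1 if v[0] == 1 and v[1] == 1 else -1 for v in X]
    h = kernel_perceptron(X, y, monotone_conjunction_kernel)
    print([predict(h, monotone_conjunction_kernel, x) for x in X])  # agrees with y

Note that each kernel evaluation takes time linear in n even though the implicit feature space has 2^n coordinates; the abstract's point is that this efficiency does not by itself guarantee a small mistake bound.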
Expanding the learning problems' input spaces to high-dimensional feature spaces can increase exp...
Valiant (1984) and others have studied the problem of learning various classes of Boolean functions...
We consider a fundamental problem in computational learning theory: learning an arbitrary Boolean f...
The paper studies machine learning problems where each example is described using a set of Boolean f...
A common problem of kernel-based online algorithms, such as the kernel-based Perceptron algorithm, i...
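This entry is cut off before it names the problem; a standard concern for kernel-based online algorithms is that the set of stored support examples can grow without bound as mistakes accumulate. Assuming that is the issue meant here, the sketch below shows the simplest budget-style remedy, capping the number of stored mistakes and discarding the oldest one when the cap is exceeded; this is a generic illustration, not the method of the paper being summarized.

    from collections import deque

    def budget_kernel_perceptron(stream, kernel, budget=100):
        # Kernel Perceptron whose support set is capped at `budget` entries;
        # when the cap is reached, appending silently drops the oldest example.
        support = deque(maxlen=budget)       # (x, y) pairs defining the hypothesis
        for x, y in stream:
            score = sum(yj * kernel(xj, x) for xj, yj in support)
            if y * score <= 0:               # mistake-driven update
                support.append((x, y))
        return support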
Kernel methods are popular nonparametric modeling tools in machine learning. The Mercer kernel funct...
The Perceptron algorithm, despite its simplicity, often performs well on online classification tasks...
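For reference, the algorithm this entry refers to can be stated in a few lines: maintain a weight vector, predict with the sign of the inner product, and add y*x to the weights only when the prediction is wrong. A minimal primal-form sketch (identifiers are illustrative):

    import numpy as np

    def perceptron(stream, dim):
        # Classical online Perceptron: mistake-driven additive updates.
        w = np.zeros(dim)
        for x, y in stream:                  # x is a feature vector, y in {-1, +1}
            if y * np.dot(w, x) <= 0:        # wrong prediction (or zero margin)
                w += y * x                   # update only on mistakes
        return w

The kernelized version sketched earlier in this list stores mistake examples instead of w, which is what lets it operate implicitly in the exponentially large conjunction space.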
We consider the problem of learning a vector-valued function f in an online learning setting. The fu...
We consider online learning in a Reproducing Kernel Hilbert Space. Our method is computationally ef...
Recent work has introduced Boolean kernels with which one can learn linear threshold functions over ...
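The Boolean kernels referred to here compute, without explicitly constructing the expansion, the inner product of feature vectors indexed by conjunctions. For the monotone case over inputs in {0,1}^n the identity is (written in LaTeX; taking this to be the kernel family intended is our assumption):

    \langle \phi(x), \phi(z) \rangle
      = \sum_{S \subseteq [n]} \prod_{i \in S} x_i z_i
      = \prod_{i=1}^{n} (1 + x_i z_i)
      = 2^{\,x \cdot z},

where the last step uses x_i z_i \in \{0,1\}. An analogous product form counts conjunctions over positive and negated literals, giving 2 raised to the number of coordinates on which x and z agree.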
We give results about the learnability and required complexity of logical formulae to solve classifi...