Abstract: We give an algorithm that with high probability properly learns random monotone DNF with t(n) terms of length ≈ log t(n) under the uniform distribution on the Boolean cube {0,1}^n. For any function t(n) ≤ poly(n), the algorithm runs in time poly(n, 1/ε) and with high probability outputs an ε-accurate monotone DNF hypothesis. This is the first algorithm that can learn monotone DNF of arbitrary polynomial size in a reasonable average-case model of learning from random examples only. Our approach relies on the discovery and application of new Fourier properties of monotone functions, which may be of independent interest.
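To make the learning setting concrete, the following is a minimal Python sketch of the model the abstract describes, not of the paper's algorithm: it draws a random monotone DNF with t terms of length ≈ log t over n variables and estimates the error of a candidate monotone DNF hypothesis under the uniform distribution on {0,1}^n. The particular term distribution and the helper names (random_monotone_dnf, uniform_error) are illustrative assumptions rather than details taken from the paper.

import math
import random

def random_monotone_dnf(n, t, seed=None):
    # Draw a random monotone DNF over n variables with t terms, each term a
    # conjunction of about log2(t) distinct positive literals chosen uniformly
    # at random (one natural model; the paper's exact distribution may differ).
    rng = random.Random(seed)
    k = max(1, round(math.log2(t)))  # term length ≈ log t(n)
    return [frozenset(rng.sample(range(n), k)) for _ in range(t)]

def evaluate(dnf, x):
    # Value of a monotone DNF on x in {0,1}^n: an OR of ANDs of positive literals.
    return any(all(x[i] for i in term) for term in dnf)

def uniform_error(target, hypothesis, n, samples=100_000, seed=0):
    # Estimate Pr_{x uniform on {0,1}^n}[target(x) != hypothesis(x)],
    # the epsilon-accuracy measure referred to in the abstract.
    rng = random.Random(seed)
    mistakes = 0
    for _ in range(samples):
        x = [rng.randint(0, 1) for _ in range(n)]
        mistakes += evaluate(target, x) != evaluate(hypothesis, x)
    return mistakes / samples

if __name__ == "__main__":
    n, t = 100, 50
    f = random_monotone_dnf(n, t, seed=1)  # unknown target in the learning model
    g = random_monotone_dnf(n, t, seed=2)  # stand-in hypothesis, not a learner's output
    print(f"estimated error of g against f: {uniform_error(f, g, n):.4f}")

An ε-accurate hypothesis in this setting is any monotone DNF g for which the error estimated above is at most ε; the abstract's claim is that such a g is found in time poly(n, 1/ε) with high probability.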
Much work has been done on learning various classes of “simple” monotone functions under the unifor...
We present a membership-query algorithm for efficiently learning DNF with respect to the uni...
function and let C be a concept class where each concept has size at most t. Define opt = min_{c ∈ C} Pr...
In 1984 Valiant introduced the distribution-independent model of Probably Approximately Correct (PAC...
We consider a model of learning Boolean functions from examples generated by a uniform random walk o...
A longstanding lacuna in the field of computational learning theory is the learnability of succinctl...
We show that the class of monotone 2^{O(√log n)}-term DNF formulae can be PAC learned in polynomi...
We give an algorithm that learns any monotone Boolean function f: {−1, 1}^n → {−1, 1} to any constant...
We show how to learn in polynomial time monotone d-term DNF formulae (formulae in disjunctive normal...
We consider the problem of learning monotone Boolean functions over {0,1}^n under the uniform distributi...
Over the years a range of positive algorithmic results have been obtained for learning various class...
In this paper, we prove two general theorems on monotone Boolean functions which are useful ...