Abstract. We introduce a new formal model in which a learning algorithm must combine a collection of potentially poor but statistically independent hypothesis functions in order to approximate an unknown target function arbitrarily well. Our motivation includes the question of how to make optimal use of multiple independent runs of a mediocre learning algorithm, as well as settings in which the many hypotheses are obtained by a distributed population of identical learning agents.
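The kind of combination the abstract alludes to can be illustrated with a minimal sketch (this is an illustration, not the paper's algorithm): assume each hypothesis independently agrees with a boolean target on any given input with probability 1/2 + gamma for some small gamma > 0. Then a simple majority vote over the population approximates the target increasingly well as the population grows. The function name `simulate` and its parameters are hypothetical.

```python
import random

def simulate(gamma=0.1, num_hypotheses=101, num_inputs=10000, seed=0):
    """Estimate the error of a majority vote over statistically
    independent hypotheses, each agreeing with the boolean target
    on any given input with probability 1/2 + gamma (an assumption
    made for illustration only)."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(num_inputs):
        # Each hypothesis independently votes correctly w.p. 1/2 + gamma.
        correct_votes = sum(
            rng.random() < 0.5 + gamma for _ in range(num_hypotheses)
        )
        # The vote errs when correct hypotheses are not a strict majority.
        if correct_votes <= num_hypotheses // 2:
            errors += 1
    return errors / num_inputs

for n in (1, 11, 101, 1001):
    print(n, simulate(num_hypotheses=n))
```

Under this independence assumption the vote's error decays exponentially in the population size (a Chernoff-bound argument); running the sketch shows the estimated error dropping from roughly 0.4 for a single hypothesis toward 0 at n = 1001.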