Abstract. We introduce a new formal model in which a learning algorithm must combine a collection of potentially poor but statistically independent hypothesis functions in order to approximate an unknown target function arbitrarily well. Our motivation includes the question of how to make optimal use of multiple independent runs of a mediocre learning algorithm, as well as settings in which the many hypotheses are obtained by a distributed population of identical learning agents.
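The abstract does not spell out the combination rule, so the following is only a minimal sketch of the simplest mechanism consistent with it: an unweighted majority vote over N statistically independent hypotheses, each assumed to be correct with probability 1/2 + GAMMA on any given input. The names target, mediocre_hypothesis, and GAMMA are hypothetical placeholders for illustration, not the paper's notation. Under the independence assumption, Hoeffding's inequality bounds the vote's error by exp(-2 * GAMMA^2 * N), which is one concrete way "potentially poor but statistically independent" hypotheses can approximate a target arbitrarily well as N grows.

```python
# Illustrative sketch only (not necessarily the paper's scheme): combine
# independent runs of a mediocre learner by unweighted majority vote.
import random

GAMMA = 0.1  # assumed per-hypothesis edge over random guessing


def target(x):
    """Stand-in for the unknown boolean target function (a fixed parity here)."""
    return x % 2


def mediocre_hypothesis(x, rng):
    """One independent run of a weak learner: correct with prob 1/2 + GAMMA."""
    correct = rng.random() < 0.5 + GAMMA
    return target(x) if correct else 1 - target(x)


def majority_vote(x, rngs):
    """Combine statistically independent hypotheses by unweighted majority."""
    votes = sum(mediocre_hypothesis(x, rng) for rng in rngs)
    return 1 if votes > len(rngs) / 2 else 0


def empirical_error(n_hypotheses, n_trials=10_000, seed=0):
    """Estimate the voted predictor's error over random inputs."""
    master = random.Random(seed)
    errors = 0
    for _ in range(n_trials):
        # Fresh, independent hypotheses for each trial point.
        rngs = [random.Random(master.random()) for _ in range(n_hypotheses)]
        x = master.randrange(1_000)
        errors += majority_vote(x, rngs) != target(x)
    return errors / n_trials


if __name__ == "__main__":
    # With GAMMA = 0.1, error should fall from roughly 0.4 at N = 1
    # toward 0 as N grows, consistent with the exp(-2 * GAMMA**2 * N) bound.
    for n in (1, 11, 101, 1001):
        print(f"N = {n:5d}: error ~ {empirical_error(n):.4f}")
```

The unweighted vote is deliberately the crudest combiner: it uses no per-hypothesis quality estimates, so any improvement it shows comes purely from independence and the fixed edge GAMMA, which is exactly the resource the abstract's model makes precise.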