We consider the problem of learning accurate models from multiple sources of “nearby” data. Given distinct samples from multiple data sources and estimates of the dissimilarities between these sources, we provide a general theory of which samples should be used to learn models for each source. This theory is applicable in a broad decision-theoretic learning framework, and yields results for classification and regression generally, and for density estimation within the exponential family. A key component of our approach is the development of approximate triangle inequalities for expected loss, which may be of independent interest.
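As a rough illustration only (not the paper's actual procedure), the idea of trading off estimation error against source dissimilarity can be sketched as a greedy pooling rule: include an additional source's samples only while its estimated dissimilarity to the target is smaller than the statistical-error scale of the data pooled so far. The function name, the `c / sqrt(n)` error scale, and the input format are all assumptions made for this sketch.

```python
import math

def select_sources(target, sizes, dissim, c=1.0):
    """Hypothetical greedy sketch: pool sources in order of increasing
    estimated dissimilarity to `target`, stopping once the bias a source
    would introduce (its dissimilarity) exceeds the variance-reduction
    benefit of its samples, modeled here as c / sqrt(pooled sample size)."""
    # Visit sources from most to least similar to the target.
    order = sorted(sizes, key=lambda s: dissim[target][s])
    pooled, n = [], 0
    for s in order:
        # Always take the target's own data; afterwards, admit a source
        # only if its dissimilarity is below the current error scale.
        if n == 0 or dissim[target][s] <= c / math.sqrt(n):
            pooled.append(s)
            n += sizes[s]
        else:
            break  # remaining sources are even more dissimilar
    return pooled
```

For example, with a small target source "A" (50 samples), a similar large source "B" (200 samples, dissimilarity 0.05), and a very dissimilar source "C" (dissimilarity 0.9), the rule pools "A" and "B" but rejects "C".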
We consider the general problem of Multiple Model Learning (MML) from data, from the statistical and...
How can we select the best performing data-driven model? How can we rigorously estimate its generali...
Several statistical problems can be described as estimation problems, where the goal is to learn a se...
We initiate the study of learning from multiple sources of limited data, each of which may be corrup...
This paper examines the problem of learning from examples in a framework that is based on, but more ...
Distributed learning of probabilistic models from multiple data repositories with minimum communicat...
Most previous work on multiple models has been done on a few domains. We present a comparison of thr...
Thesis (Ph.D.), University of Illinois at Urbana-Champaign, 1993, 115 pp. This thesis examines issues r...
We provide a decision theoretic approach to the construction of a learning process in the presence o...
We propose two models, one continuous and one categorical, to learn about dependence between two ran...
Discriminative learning methods for classification perform well when training and test data are draw...
We introduce a new formal model in which a learning algorithm must combine a collection of...
What happens to the optimal interpretation of noisy data when there exists more than one equally pla...