For many types of learners one can compute the statistically "optimal" way to select data. We review how these techniques have been used with feedforward neural networks [MacKay, 1992; Cohn, 1994]. We then show how the same principles may be used to select data for two alternative, statistically-based learning architectures: mixtures of Gaussians and locally weighted regression. While the techniques for neural networks are expensive and approximate, the techniques for mixtures of Gaussians and locally weighted regression are both efficient and accurate.
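As an illustration of the variance-minimizing selection principle described in this abstract, the sketch below scores each candidate query by the average predictive variance that would remain after labeling it and picks the minimizer. It assumes a simple linear-in-parameters learner and a variance-only (bias-free) error criterion for clarity; the paper itself covers neural networks, mixtures of Gaussians, and locally weighted regression, and the helper names (expected_avg_variance, select_query) and the reference set are hypothetical, not the paper's exact algorithm.

import numpy as np

def expected_avg_variance(X_train, x_candidate, X_reference, noise_var=1.0, ridge=1e-6):
    # X_train: (n, d) already-labeled inputs; x_candidate: (1, d) query under
    # consideration; X_reference: (m, d) points where prediction accuracy matters.
    # For a linear-in-parameters learner with Gaussian noise, the predictive
    # variance at x is noise_var * x^T (X^T X)^{-1} x, so we score the candidate
    # by the average variance that would remain after adding it to the design.
    X_aug = np.vstack([X_train, x_candidate])
    A = X_aug.T @ X_aug + ridge * np.eye(X_aug.shape[1])
    A_inv = np.linalg.inv(A)
    var = noise_var * np.einsum('ij,jk,ik->i', X_reference, A_inv, X_reference)
    return var.mean()

def select_query(X_train, X_pool, X_reference):
    # Variance-only active selection: choose the pool point whose labeling
    # minimizes the expected average predictive variance (bias is ignored).
    scores = [expected_avg_variance(X_train, x[None, :], X_reference)
              for x in X_pool]
    return int(np.argmin(scores))

# Toy usage: 1-D inputs with a bias feature; the pool doubles as the reference set.
rng = np.random.default_rng(0)
X_train = np.hstack([np.ones((5, 1)), rng.uniform(-1.0, 1.0, (5, 1))])
X_pool = np.hstack([np.ones((100, 1)), np.linspace(-1.0, 1.0, 100)[:, None]])
print("next query index:", select_query(X_train, X_pool, X_pool))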
Labeled data can be expensive to acquire in several application domains, including medical imaging, ...
Providing a broad but in-depth introduction to neural network and machine learning in a statistical ...
Active learning (AL) is a branch of machine learning that deals with problems where unlabeled data i...
For many types of machine learning algorithms, one can compute the statistically "optimal" way ...
For many types of learners one can compute the statistically 'optimal' way to select data. We revi...
We discuss a new paradigm for supervised learning that aims at improving the efficiency of ...
Hasenjäger M. Active data selection in supervised and unsupervised learning. Bielefeld: Bielefeld Un...
A statistically-based algorithm for pruning weights from feed-forward networks is presented. This a...
In the context of supervised learning of a function by a neural network, we cl...
In this work, we face the problem of training sample collection for the estimation of biophysical pa...
In terms of the Bias/Variance decomposition, very flexible (i.e., complex) Supervised Machine Learni...
This paper presents a rigorous statistical analysis characterizing regimes in which active learning ...
This textbook considers statistical learning applications when interest centers on the conditional d...
Optimal active learning refers to a framework where the learner actively selects data points to be a...
Most active learning methods avoid model selection by training models of one type (SVMs, boosted tre...