Discriminatively-trained probabilistic models are widely useful because of the latitude they afford in designing features. But training involves complex trade-offs among weights, which can be dangerous: a few highly-indicative features can swamp the contribution of many individually weaker features, causing their weights to be undertrained. Such a model is less robust, for the highly-indicative features may be noisy or missing in the test data. To ameliorate this \emph{weight undertraining}, we propose a new training method, called \emph{feature bagging}, in which separate models are trained on subsets of the original features, and combined using a mixture model or a product of experts. We evaluate feature bagging on linear-chain conditional random fields...
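A minimal sketch of the feature-bagging idea described above, assuming NumPy and scikit-learn are available. It uses a plain logistic regression classifier as a stand-in for the linear-chain models evaluated in the paper, and random feature subsets in place of task-specific feature partitions; the function names, subset sizes, and parameters here are illustrative assumptions, not the paper's implementation.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def train_feature_bagged_models(X, y, n_subsets=5, subset_frac=0.5):
    # Train one expert per random feature subset (a hypothetical subset
    # choice; the method itself only requires that each expert see a
    # restricted view of the features).
    n_features = X.shape[1]
    subset_size = max(1, int(subset_frac * n_features))
    experts = []
    for _ in range(n_subsets):
        idx = rng.choice(n_features, size=subset_size, replace=False)
        model = LogisticRegression(max_iter=1000).fit(X[:, idx], y)
        experts.append((model, idx))
    return experts

def predict_mixture(experts, X):
    # Mixture combination: average the experts' class-probability estimates.
    probs = [model.predict_proba(X[:, idx]) for model, idx in experts]
    return np.mean(probs, axis=0)

def predict_product_of_experts(experts, X):
    # Product-of-experts combination: multiply the experts' probabilities
    # (sum their logs) and renormalize over classes.
    log_probs = [np.log(model.predict_proba(X[:, idx]) + 1e-12)
                 for model, idx in experts]
    joint = np.exp(sum(log_probs))
    return joint / joint.sum(axis=1, keepdims=True)

In both combination rules no single expert, and hence no single highly-indicative feature, can dominate the final prediction on its own, which is the intuition behind using feature bagging to mitigate weight undertraining.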
Bagging has been found to be successful in increasing the predictive performance of unstable classif...
In this paper, we investigate the method of stacked generalization in combining models derived from ...
We investigate machine learning techniques for coping with highly skewed class distributions in two ...
Discriminative probabilistic models are very popular in NLP because of the latitude they afford in d...
This paper analyses the relation between the use of similarity in Memory-Based Learning and the notio...
We introduce a learning algorithm for the weights in a very common class of discrimination functions...
We focus on the adaptation of boosting to representation spaces composed of di...
The major hypothesis that we will prove in this paper is that unsupervised learning tec...
Breiman's bagging and Freund and Schapire's boosting are recent methods for improving the...
Nearest-neighbor algorithms are known to depend heavily on their distance metric. In this paper, w...
In this paper, we propose lazy bagging (LB), which builds bootstrap replicate bags based on the char...
Learning from imbalanced data is an important problem in data mining research. Much research has add...