Entropy gain is widely used for learning decision trees. However, as we go deeper down the tree, the examples become rarer and the faithfulness of entropy decreases. Thus, misleading choices and overfitting may occur, and the tree has to be adjusted using an early-stopping criterion or post-pruning algorithms. However, these methods still depend on the choices previously made, which may be unsatisfactory. We propose a new cumulative entropy function based on confidence intervals on frequency estimates that together considers the entropy of the probability distribution and the uncertainty around the estimation of its parameters. This function takes advantage of the ability of a possibility distribution to upper bound a family of pr...
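The abstract only sketches the proposed cumulative entropy, so the following is not the paper's method but a minimal Python illustration of the general idea: an entropy estimate that also reflects the uncertainty of small-sample frequency estimates. It assumes a normal-approximation confidence interval on each class frequency; the function names `entropy` and `pessimistic_entropy` and the toward-uniform shift are hypothetical choices for this sketch.

```python
import math

def entropy(p):
    """Shannon entropy (base 2) of a probability vector."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def pessimistic_entropy(counts, z=1.96):
    """Heuristic 'cautious' entropy (illustrative, not the paper's
    definition): shift each estimated class frequency toward the
    uniform distribution by at most the half-width of its normal
    confidence interval, then renormalize.  Small samples therefore
    look closer to uniform (higher entropy) than their raw
    frequencies suggest."""
    n = sum(counts)
    k = len(counts)
    u = 1.0 / k
    shifted = []
    for c in counts:
        p = c / n
        eps = z * math.sqrt(p * (1 - p) / n)  # CI half-width on p
        # move p toward the uniform value u by at most eps
        if p > u:
            p = max(u, p - eps)
        else:
            p = min(u, p + eps)
        shifted.append(p)
    s = sum(shifted)
    return entropy([p / s for p in shifted])
```

With 10 examples split 9/1, the cautious estimate is noticeably higher than the raw entropy of (0.9, 0.1); with 1000 examples in the same proportions it nearly coincides with it, mirroring the claim that entropy becomes less faithful as examples become rarer deep in the tree.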
The combination of mathematical models and uncertainty measures can be applied in the area of data m...
ABSTRACT. We introduce an extension of the notion of Shannon conditional entropy to a more general f...
In data mining, large differences in prior class probabilities known as the cl...
Many algorithms of machine learning use an entropy measure as optimization criterion. Among the wide...
In this paper we present a new entropy measure to grow decision trees. This measure has the characte...
In this paper, we consider decision trees that use both conventional queries based on one attribute ...
Decision tree classifiers are a widely used tool in data stream mining. The use of confidence intervals ...
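In stream mining, the confidence-interval idea mentioned above is commonly realized with the Hoeffding bound, as in Hoeffding (VFDT) trees: a node splits only when the observed gain advantage of the best attribute over the runner-up exceeds the bound, so the choice is unlikely to change with more data. A minimal sketch, with illustrative parameter names:

```python
import math

def hoeffding_bound(value_range, delta, n):
    """Hoeffding epsilon: with probability 1 - delta, the true mean of
    a variable with range `value_range` lies within eps of the mean of
    n observations."""
    return math.sqrt(value_range ** 2 * math.log(1.0 / delta) / (2.0 * n))

def should_split(best_gain, second_gain, n, delta=1e-7, value_range=1.0):
    """Split when the best attribute's observed gain beats the
    runner-up's by more than the Hoeffding epsilon."""
    return (best_gain - second_gain) > hoeffding_bound(value_range, delta, n)
```

After 1000 examples with delta = 1e-7, the bound is roughly 0.09, so a 0.2 gain advantage triggers a split while a 0.05 advantage defers the decision until more examples arrive.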
As well-known machine learning methods, decision trees are widely applied in classification and reco...
In this work, we analyze the cross-entropy function, widely used in classifiers both as a performanc...
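Cross-entropy as a classifier loss, as analyzed in the abstract above, reduces to the average negative log-probability the model assigns to the true class. A minimal sketch (function and parameter names are this sketch's own, not the paper's):

```python
import math

def cross_entropy(targets, probs, eps=1e-12):
    """Average cross-entropy: for each example, the negative log of
    the probability assigned to the true class index.  Clipping by
    `eps` avoids log(0) on overconfident wrong predictions."""
    total = 0.0
    for t, p in zip(targets, probs):
        total -= math.log(min(max(p[t], eps), 1.0))
    return total / len(targets)
```

A perfectly confident correct prediction contributes zero loss; a 50/50 prediction on a two-class problem contributes ln 2 ≈ 0.693.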