One important disadvantage of decision-tree-based inductive learning algorithms is that they use some irrelevant values when building the decision tree, which makes the final rule set less general. To overcome this problem, the tree has to be pruned. This article explains how a decision tree can be pruned using the recently developed RULES inductive learning algorithm. A decision tree is first extracted for an example problem using the ID3 algorithm and is then pruned with RULES. The results obtained before and after pruning are compared, showing that the pruned decision tree is more general.
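For readers unfamiliar with the ID3 step mentioned above, the following is a minimal sketch (not the implementation used in the article) of the information-gain criterion ID3 uses to choose split attributes when growing a tree top-down; the toy data, attribute indices, and function names are invented for illustration.

import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a sequence of class labels.
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def information_gain(rows, labels, attribute):
    # Reduction in entropy obtained by splitting the examples on one attribute
    # (the criterion ID3 uses to pick the test at each node).
    groups = {}
    for row, label in zip(rows, labels):
        groups.setdefault(row[attribute], []).append(label)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Hypothetical toy data: attribute 0 separates the classes perfectly,
# so ID3 would choose it for the root node.
rows = [("sunny", "hot"), ("sunny", "mild"), ("rain", "mild"), ("rain", "hot")]
labels = ["no", "no", "yes", "yes"]
best = max(range(2), key=lambda a: information_gain(rows, labels, a))
print("split on attribute", best)   # prints: split on attribute 0

Pruning then removes branches whose tests contribute little to the classification, which is what yields the more general rule set reported above.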
Decision tree learning represents a well-known family of inductive learning algorithms th...
An additive quality measure based on information theory is introduced for the inductive inference of...
To date, decision trees are among the most used classification models. They owe their popularity to ...
Induction methods have recently been found to be useful in a wide variety of business related proble...
Inductive learning enables the system to recognize patterns and regularities in previous knowledge o...
Pre-Pruning and Post-Pruning are two standard methods of dealing with noise in decision tree learnin...
This paper compares five methods for pruning decision trees, developed from sets of examples. When u...
Top-down induction of decision trees has been observed to suffer from the inadequate functioning of th...
The pruning phase is one of the necessary steps in decision tree induction. Existing pruning algorit...
Decision trees are among the most extensively researched domains in Knowledge Discovery....
There exist several methods for transforming decision trees to neural networks. These methods typica...
Pruning is one of the key procedures in training decision tree classifiers. It removes trivial rules...
In this paper, we address the problem of retrospectively pruning decision trees induced from data, a...
This paper extends recent work on decision tree grafting. Grafting is an inductive process that adds...