Exhaustive search in relational learning is generally infeasible; some form of heuristic search is therefore usually employed, as in FOIL [1]. So-called stochastic discrimination, on the other hand, provides a framework for combining arbitrarily many weak classifiers (in this case, randomly generated relational rules) in such a way that accuracy improves with additional rules, even after maximal accuracy on the training data has been reached [2]. The weak classifiers must have a slightly higher probability of covering instances of their target class than instances of other classes. Since the rules are also independent and identically distributed, the central limit theorem applies, and as the number of weak classifiers/rules grows, coverages for differe...
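The central-limit-theorem argument above can be illustrated with a minimal simulation. This is a sketch under assumed parameters, not the cited method: each hypothetical weak rule "covers" an instance with probability 0.55 if the instance belongs to the target class and 0.45 otherwise (the "slightly higher probability" condition), and an instance is classified as the target class when its mean coverage over all rules exceeds the midpoint. As the number of i.i.d. rules grows, the mean coverages of the two classes concentrate around 0.55 and 0.45, so accuracy keeps improving.

```python
import random

random.seed(0)

def weak_rule_covers(is_target, p_target=0.55, p_other=0.45):
    # A randomly generated weak rule covers an instance with slightly
    # higher probability when the instance belongs to the target class.
    # The probabilities 0.55/0.45 are illustrative assumptions.
    p = p_target if is_target else p_other
    return 1 if random.random() < p else 0

def ensemble_accuracy(n_rules, n_instances=2000):
    # Classify each instance by whether its mean coverage across
    # n_rules i.i.d. weak rules exceeds the midpoint between the
    # two class coverage rates (here 0.5).
    correct = 0
    for _ in range(n_instances):
        is_target = random.random() < 0.5
        coverage = sum(weak_rule_covers(is_target) for _ in range(n_rules))
        predicted_target = coverage / n_rules > 0.5
        correct += (predicted_target == is_target)
    return correct / n_instances

for n in (1, 10, 100, 1000):
    print(f"{n:5d} rules -> accuracy {ensemble_accuracy(n):.3f}")
```

A single rule is barely better than chance, but by the central limit theorem the mean coverage of each class has standard deviation shrinking as 1/sqrt(n_rules), so the two class distributions separate and ensemble accuracy approaches 1.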
A relational probability tree (RPT) is a type of decision tree that can be used for probabilistic cl...
Real-world data often involve objects that exhibit multiple relations. A typical learning problem re...
Clustering of relational data has so far received a lot less attention than classification of such d...
In the field of machine learning, methods for learning from single-table data have received much mor...
One of the obstacles to widely using first-order logic languages is the fact that relational inferen...
The efficiency of multi-relational data mining algorithms, addressing the problem of learning First ...
In relational learning, predictions for an individual are based not only on its own properties but a...
Random Forests have been shown to perform very well in propositional learning. FORF is an upgrade of...
There are a variety of methods for inducing predictive systems from observed data. Many of these met...
Dependency networks approximate a joint probability distribution over multiple random variables as a...
In this paper we investigate an approach to semi-supervised learning based on randomized proposition...
We live in a richly interconnected world and, not surprisingly, we generate richly interconnected da...