The most important feature of a classifier is its generalisation capability, which depends on a correct estimate of both the parameters and the structural complexity of the network. For Bayesian classifiers, this problem is usually addressed by optimising different instances of the EM algorithm used in the Maximum Likelihood approach. In this regard, we propose an EM-based algorithm that performs this optimisation with a substantial reduction in computational cost and without loss of classification accuracy. This is obtained through a hierarchical growing approach, based on a given splitting procedure, which determines the overall structural complexity of the classifier more efficiently and t...
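As a minimal sketch of the kind of procedure the abstract describes (not the authors' implementation), the snippet below fits a per-class Gaussian mixture by EM and then grows its structural complexity through a component-splitting step. The function names, the diagonal-covariance model, and the splitting heuristic are illustrative assumptions, not details taken from the paper.

```python
# A hedged sketch: EM for a diagonal-covariance Gaussian mixture, plus a
# hypothetical splitting step that doubles the number of components so the
# mixture can be grown hierarchically and refined by further EM rounds.
import numpy as np

def em_gaussian_mixture(X, n_components, n_iter=50, eps=1e-6):
    """Fit a diagonal-covariance Gaussian mixture to X by EM."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    means = X[rng.choice(n, n_components, replace=False)]
    vars_ = np.tile(X.var(axis=0) + eps, (n_components, 1))
    weights = np.full(n_components, 1.0 / n_components)
    for _ in range(n_iter):
        # E-step: responsibilities of each component for each sample
        log_p = (-0.5 * (((X[:, None, :] - means) ** 2) / vars_
                         + np.log(2 * np.pi * vars_)).sum(axis=2)
                 + np.log(weights))
        log_p -= log_p.max(axis=1, keepdims=True)
        resp = np.exp(log_p)
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture weights, means and variances
        nk = resp.sum(axis=0) + eps
        weights = nk / n
        means = (resp.T @ X) / nk[:, None]
        vars_ = (resp.T @ (X ** 2)) / nk[:, None] - means ** 2 + eps
    return weights, means, vars_

def split_components(weights, means, vars_):
    """Hypothetical splitting step: duplicate each component with perturbed
    means, doubling the mixture size before the next EM refinement."""
    jitter = 0.1 * np.sqrt(vars_)
    new_means = np.vstack([means - jitter, means + jitter])
    new_vars = np.vstack([vars_, vars_])
    new_weights = np.concatenate([weights, weights]) / 2.0
    return new_weights, new_means, new_vars
```

Repeating EM-then-split rounds, and accepting a split only if a held-out score (e.g. validation log-likelihood) improves, is one way to read the hierarchical control of structural complexity described above; the stopping rule here is an assumption, not part of the abstract.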
Markov chain Monte Carlo (MCMC) methods are an important class of computation ...
Evolutionary computation is a discipline that has been emerging for at least 40 or 50 years. All met...
Incorporating subset selection into a classification method often carries a number of advantages, e...
The use of Bayesian Networks (BNs) as classifiers in different fields of application has rec...
The use of Bayesian networks for classification problems has received significant recent attention. ...
The margin criterion for parameter learning in graphical models gained significant impact ov...
The Hierarchical Mixture of Experts (HME) architecture is a powerful tree-structured architecture for...
In this paper an algorithm based on the concepts of genetic algorithms that uses an estimation of a...
We study hierarchical classification in the general case when an instance could belong to more than ...
This work proposes and discusses an approach for inducing Bayesian classifiers aimed at balancing th...
I consider a binary classification problem with a feature vector of high dimensionality. Spam mail f...
For an augmented Bayesian network classifier we propose a method of scoring a set of feature nodes f...
This paper presents a method for significantly improving conventional Bayesian statistica...
We introduce the family of multi-dimensional Bayesian network classifiers. These classifiers includ...
In previous work, we have seen how to learn a TAN classifier from an incomplete dataset using the Expec...