This paper introduces a new method to automatically, rapidly, and reliably evaluate the class-conditional probability of any subset of variables in supervised learning. It is based on partitioning each input variable into intervals in the numerical case and into groups of values in the categorical case. The cross-product of the univariate partitions forms a multivariate partition of the input representation space into a set of cells. This multivariate partition, called a data grid, is a piecewise-constant nonparametric estimator of the class-conditional probability. The best data grid is selected using a Bayesian model selection approach and an efficient combinatorial algorithm. We also extend data grids to joint density estimation in uns...
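A minimal sketch of the data-grid idea described above: each numeric variable is discretized, the cross-product of the univariate partitions defines cells, and P(class | cell) is estimated by counting. This is an illustration only, with assumptions not in the abstract: fixed equal-frequency binning replaces the paper's Bayesian model-selection search, Laplace smoothing is added for empty cells, and all names (`DataGridClassifier`, `quantile_edges`, `cell_of`) are hypothetical.

```python
from bisect import bisect_right
from collections import Counter, defaultdict

def quantile_edges(values, n_bins):
    """Equal-frequency bin edges for one numeric variable (a stand-in
    for the paper's optimized univariate partition)."""
    s = sorted(values)
    return [s[(i * len(s)) // n_bins] for i in range(1, n_bins)]

def cell_of(x, edges_per_dim):
    """Map an input vector to its data-grid cell: a tuple of per-variable
    bin indices, i.e. one cell of the cross-product partition."""
    return tuple(bisect_right(edges, xi) for xi, edges in zip(x, edges_per_dim))

class DataGridClassifier:
    """Piecewise-constant estimator of the class-conditional probability:
    P(class | x) is constant within each cell of the grid."""

    def __init__(self, n_bins=3, alpha=1.0):
        self.n_bins = n_bins      # bins per variable (fixed, not model-selected)
        self.alpha = alpha        # Laplace smoothing count

    def fit(self, X, y):
        self.classes = sorted(set(y))
        columns = list(zip(*X))   # one column per input variable
        self.edges = [quantile_edges(col, self.n_bins) for col in columns]
        self.counts = defaultdict(Counter)
        for x, label in zip(X, y):
            self.counts[cell_of(x, self.edges)][label] += 1
        return self

    def predict_proba(self, x):
        c = self.counts[cell_of(x, self.edges)]
        total = sum(c.values()) + self.alpha * len(self.classes)
        return {k: (c[k] + self.alpha) / total for k in self.classes}
```

For example, fitting on four one-dimensional points with two classes and two bins yields a grid whose cells each contain one class, so a query falling in the first cell returns a smoothed probability of 0.75 for that cell's class.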
This thesis develops models and associated Bayesian inference methods for flexible univariate and mu...
We propose a general partition-based strategy to estimate conditional density with candidate densiti...
Latent variable models are used extensively in unsupervised learning within the Bayesian paradigm, t...
The data preparation step of the data mining process represents 80% of the problem and is both time ...
Abstract In the data preparation phase of data mining, supervised discretization and value grouping ...
We introduce Multi-Conditional Learning, a framework for optimizing graphical models based not on jo...
Nonparametric estimation of the conditional distribution of a response given high-dimensional featur...
In this paper, we consider the supervised learning task which consists in predicting the normalized ...
Probabilistic label learning is a challenging task that arises from recent real-world problems withi...
We propose a simple taxonomy of probabilistic graphical models for the semi-supervised learning prob...
As machine learning gains significant attention in many disciplines and research communities, the v...
We study the problem of learning Bayesian classifiers (BC) when the true class label of the training ...
One desirable property of machine learning algorithms is the ability to balance the number of p...
A new supervised learning algorithm using a naïve Bayesian classifier is present...
In this paper we present a histogram-like estimator of a conditional density that uses super learner...