<p>Most machine learning approaches assume that data are complete. When data are partially missing, for example because of sensor failures, image corruption, or inadequate medical measurements, many learning methods designed for complete data cannot be applied directly. In this dissertation we treat two kinds of problems with incomplete data using non-parametric Bayesian approaches: classification with incomplete features and analysis of low-rank matrices with missing entries.</p><p>Incomplete data in classification problems are handled by assuming input features to be generated from a mixture-of-experts model, with each individual expert (classifier) defined by a local Gaussian in feature s...
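The local-Gaussian idea above rests on a convenient fact: marginals of a multivariate Gaussian are again Gaussian. A minimal sketch (not the dissertation's model; class parameters and the NaN-for-missing convention are assumptions for illustration) shows how an incomplete feature vector can be scored per class using only its observed coordinates, with no imputation:

```python
import numpy as np

def gaussian_logpdf(y, mean, cov):
    """Log-density of y under N(mean, cov)."""
    diff = y - mean
    _, logdet = np.linalg.slogdet(cov)
    quad = diff @ np.linalg.solve(cov, diff)
    return -0.5 * (len(y) * np.log(2 * np.pi) + logdet + quad)

def class_log_likelihood(x, mean, cov):
    """Score x under a class Gaussian, using only its non-NaN (observed) entries.

    The marginal over the observed coordinates is Gaussian with the
    corresponding sub-mean and sub-covariance, so missing entries are
    integrated out exactly rather than imputed.
    """
    obs = ~np.isnan(x)
    if not obs.any():
        return 0.0  # no evidence: every class scores equally
    return gaussian_logpdf(x[obs], mean[obs], cov[np.ix_(obs, obs)])

# Two hypothetical classes with known Gaussian parameters (toy values).
params = {
    0: (np.array([0.0, 0.0]), np.eye(2)),
    1: (np.array([3.0, 3.0]), np.eye(2)),
}

x = np.array([2.8, np.nan])  # second feature missing
scores = {c: class_log_likelihood(x, m, S) for c, (m, S) in params.items()}
pred = max(scores, key=scores.get)
```

Because the observed coordinate sits near class 1's mean, the marginalized likelihood favors class 1 even though half the feature vector is missing.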
We introduce quadratically gated mixture of experts (QGME), a statistical model for multi-class nonl...
We introduce a new method based on Bayesian Network formalism for automatically generatin...
Naive Bayes classifiers provide an efficient and scalable approach to supervised classificat...
Real-world learning tasks often involve high-dimensional data sets with complex patterns of missing ...
We address the incomplete-data problem in which feature vectors to be classified are missin...
Learning, inference, and prediction in the presence of missing data are pervasive problems in machin...
This article describes a new approach to Bayesian selection of decomposable models with incomplete ...
Missing data occur frequently in surveys, clinical trials as well as other real data studies. In the...
We propose an efficient family of algorithms to learn the parameters of a Bayesian network from inco...
We present new algorithms for learning Bayesian networks from data with missing values using a data ...
We propose a family of efficient algorithms for learning the parameters of a Bayesian network from i...
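Several entries above concern learning Bayesian network parameters from incomplete data, typically via expectation–maximization. As a hedged illustration only (not any of the cited algorithms), here is one EM pass for a toy two-node binary network A → B, where `None` marks a missing entry: the E-step spreads each partially observed record over its possible completions in proportion to the current model, and the M-step re-normalizes the expected counts into parameters.

```python
import itertools

def em_step(data, p_a, p_b_given_a):
    """One EM iteration for the network A -> B over binary variables.

    data: list of (a, b) pairs where either entry may be None (missing).
    p_a: current P(A=1); p_b_given_a: dict {a: P(B=1 | A=a)}.
    """
    # Expected sufficient statistics, with tiny pseudocounts for stability.
    n_a = {0: 1e-6, 1: 1e-6}
    n_ab = {(a, b): 1e-6 for a in (0, 1) for b in (0, 1)}
    for a_obs, b_obs in data:
        # E-step: joint weight of each completion consistent with the record.
        weights = {}
        for a, b in itertools.product((0, 1), repeat=2):
            if (a_obs is not None and a != a_obs) or \
               (b_obs is not None and b != b_obs):
                continue
            pa = p_a if a == 1 else 1 - p_a
            pb = p_b_given_a[a] if b == 1 else 1 - p_b_given_a[a]
            weights[(a, b)] = pa * pb
        z = sum(weights.values())
        for (a, b), w in weights.items():
            n_a[a] += w / z
            n_ab[(a, b)] += w / z
    # M-step: maximum-likelihood estimates from the expected counts.
    new_p_a = n_a[1] / (n_a[0] + n_a[1])
    new_p_b = {a: n_ab[(a, 1)] / (n_ab[(a, 0)] + n_ab[(a, 1)])
               for a in (0, 1)}
    return new_p_a, new_p_b

# Toy data set with missing entries.
data = [(1, 1), (1, None), (0, 0), (None, 1)]
p_a, p_b = 0.5, {0: 0.5, 1: 0.5}
for _ in range(20):
    p_a, p_b = em_step(data, p_a, p_b)
```

After a few iterations the estimates reflect the observed pattern that B tends to co-occur with A, i.e. P(B=1 | A=1) exceeds P(B=1 | A=0). The cited papers differ in how they make this basic scheme efficient for large networks.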