Gaussian Processes (GPs) are effective Bayesian predictors. We here show for the first time that instance labels of a GP classifier can be inferred in the multiple instance learning (MIL) setting using variational Bayes. We achieve this via a new construction of the bag likelihood that assumes a large value if the instance predictions obey the MIL constraints and a small value otherwise. This construction lets us derive the update rules for the variational parameters analytically, assuring both scalable learning and fast convergence. We observe this model to improve the state of the art in instance label prediction from bag-level su...
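The MIL constraint mentioned in the abstract — a bag is positive if and only if at least one of its instances is positive — can be illustrated with a small sketch. The noisy-OR smoothing below is a common hypothetical stand-in for a soft bag likelihood; it is not the paper's exact construction, which is derived to make the variational updates analytic.

```python
import numpy as np

def bag_likelihood(bag_label, instance_probs, eps=1e-6):
    """Soft MIL bag likelihood (illustrative noisy-OR stand-in).

    bag_label: 1 if the bag is labeled positive, 0 otherwise.
    instance_probs: predicted positive-class probabilities for the
    bag's instances (e.g., from a GP classifier's predictive mean).
    """
    # MIL constraint, softened: P(bag positive) = 1 - prod_i (1 - p_i),
    # i.e., the bag is positive unless every instance is negative.
    p_pos = 1.0 - np.prod(1.0 - np.asarray(instance_probs, dtype=float))
    p_pos = np.clip(p_pos, eps, 1.0 - eps)  # keep the likelihood bounded
    # Large value when instance predictions agree with the bag label,
    # small value otherwise -- matching the construction in the abstract.
    return p_pos if bag_label == 1 else 1.0 - p_pos
```

For example, a positive bag containing one confidently positive instance receives a high likelihood, while the same instances under a negative bag label receive a low one.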
We analyze and evaluate a generative process for multiple-instance learning (MIL) in which bags are ...
We study the problem of learning Bayesian classifiers (BC) when the true class label of the training ...
Often in machine learning, data are collected as a combination of multiple conditions, e.g., the voi...
We propose a generative Bayesian model that predicts instance labels from weak (bag-level) supervisi...
As a branch of machine learning, multiple instance learning (MIL) learns from a collection of labele...
Gaussian processes (GPs) are widely used in the Bayesian approach to supervised learning. Their abil...
Gaussian process (GP) models are powerful tools for Bayesian classification, but their limitation is...
Many real classification tasks are oriented to sequence (neighbor) labeling, that is, assigning a l...
Multi-instance learning (MIL) deals with objects represented as bags of instances and can predict in...
We attack the problem of general object recognition by learning probabilistic, nonlinear object clas...
In this paper, we propose the variational EM inference algorithm for the multi-class Gaussian proces...
We describe a variational Bayes (VB) learning algorithm for generalized autoregressive (GAR) models....
Variational inference techniques based on inducing variables provide an elegant framework for scalab...
Gaussian processes (GP) provide an attractive machine learning model due to their non-parametric fo...
Variational methods have been recently considered for scaling the training process of Gaussian proce...