The marginal likelihood is a central tool for drawing Bayesian inference about the number of components in mixture models. It is often approximated, since its exact form is unavailable. A bias in the approximation may be due to incomplete exploration by a simulated Markov chain (e.g., a Gibbs sequence) of the collection of posterior modes, a phenomenon also known as lack of label switching: to converge, and hence to overcome the bias, the chain must visit all possible label permutations. In an importance sampling approach, imposing label switching on the importance function results in an exponential increase of the computational cost with the number of components. In this paper, two importance sampling schemes are proposed ...
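To make the cost issue concrete, here is a minimal importance-sampling sketch for the evidence of a two-component Gaussian mixture (the toy data, priors, and proposal q are illustrative assumptions, and this is not the paper's proposed scheme): symmetrising the importance function over label permutations requires evaluating it at all k! relabellings of every draw, which is the source of the exponential cost in the number of components k.

```python
# A minimal sketch, not the paper's scheme: importance-sampling estimate of the
# marginal likelihood (evidence) of a two-component Gaussian mixture. The toy
# data, priors, and Gaussian proposal below are illustrative assumptions.
import numpy as np
from itertools import permutations
from scipy import stats

rng = np.random.default_rng(0)
y = rng.normal([-2.0, 2.0], 1.0, size=(50, 2)).ravel()  # toy data

def log_lik(mu):
    # Equal-weight mixture of N(mu_j, 1) components.
    comp = stats.norm.logpdf(y[:, None], loc=mu[None, :], scale=1.0)
    return np.sum(np.logaddexp.reduce(comp, axis=1) - np.log(len(mu)))

def log_prior(mu):
    return np.sum(stats.norm.logpdf(mu, loc=0.0, scale=5.0))

# Proposal centred near one posterior mode; its symmetrised version must be
# evaluated at every one of the k! label permutations of each draw, hence the
# exponential growth of the cost with the number of components k.
q = stats.norm(loc=[-2.0, 2.0], scale=0.5)

def log_q_symmetrised(mu):
    perms = list(permutations(range(len(mu))))               # k! terms
    terms = [np.sum(q.logpdf(mu[list(p)])) for p in perms]
    return np.logaddexp.reduce(terms) - np.log(len(perms))

N = 5000
draws = q.rvs(size=(N, 2), random_state=rng)
draws = rng.permuted(draws, axis=1)  # random relabelling: sample from q_sym
log_w = np.array([log_lik(m) + log_prior(m) - log_q_symmetrised(m)
                  for m in draws])
log_evidence = np.logaddexp.reduce(log_w) - np.log(N)
print("log marginal likelihood estimate:", log_evidence)
```

With k = 2 the symmetrisation costs only a factor of 2, but the k! density evaluations per draw quickly dominate as k grows, which is the bottleneck the abstract describes.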
Recent advances in Markov chain Monte Carlo (MCMC) extend the scope of Bayesian inference to models ...
Bayesian networks (BNs) offer a compact, intuitive, and efficient graphical representation of uncert...
Monte Carlo methods rely on random sampling to compute and approximate expecta...
In this paper, we propose an adaptive algorithm that iteratively updates both the weights and compon...
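For context, the sketch below shows one generic form such an adaptive scheme can take (a weighted moment-matching update of a two-component Gaussian mixture proposal against an assumed bimodal target; not necessarily the algorithm of the cited paper): at each iteration the sample is importance-weighted, and the mixture weights and component means are refit from the weighted sample.

```python
# A minimal adaptive importance sampling loop: a two-component Gaussian mixture
# proposal whose weights and component means are refit at every iteration from
# the importance-weighted sample (weighted EM-style responsibilities). The
# bimodal target below is an illustrative assumption.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

def target_pdf(x):  # assumed target: bimodal Gaussian mixture
    return 0.7 * stats.norm.pdf(x, -4, 1) + 0.3 * stats.norm.pdf(x, 4, 1)

# Initial proposal: poorly placed mixture components, fixed scale.
alphas, mus, sigma = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), 2.0

for it in range(10):
    # Sample from the current mixture proposal.
    comp = rng.choice(2, size=4000, p=alphas)
    x = rng.normal(mus[comp], sigma)

    q_pdf = sum(a * stats.norm.pdf(x, m, sigma) for a, m in zip(alphas, mus))
    w = target_pdf(x) / q_pdf
    w /= w.sum()

    # Responsibility of each component for each draw.
    r = np.stack([a * stats.norm.pdf(x, m, sigma) for a, m in zip(alphas, mus)])
    r /= r.sum(axis=0)

    # Refit mixture weights and component means from the weighted sample.
    alphas = (r * w).sum(axis=1)
    alphas /= alphas.sum()
    mus = (r * w * x).sum(axis=1) / (r * w).sum(axis=1)

print("adapted weights:", alphas, " adapted means:", mus)
```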
In the present paper we explore various approaches to computing model likelihoods from the MCMC outp...
Calculation of the marginal likelihood or evidence is a problem central to model selection and model...
Importance sampling methods can be iterated like MCMC algorithms, while being more robust against de...
Importance sampling involves approximation of functionals (such as expectations) of a target distrib...
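The basic mechanism is short enough to state in code. The following self-normalised importance-sampling sketch (the unnormalised Gaussian target, Student-t proposal, and test functional f(x) = x² are assumptions for illustration) approximates an expectation under the target by drawing from a proposal q and normalising the weights π(x)/q(x).

```python
# A minimal self-normalised importance-sampling sketch; the target (a standard
# Gaussian known only up to its normalising constant here), the Student-t
# proposal, and the test functional f(x) = x**2 are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def log_target_unnorm(x):          # log pi(x) up to an additive constant
    return -0.5 * x**2

proposal = stats.t(df=3)           # heavy-tailed proposal keeps weights stable
x = proposal.rvs(size=10_000, random_state=rng)

log_w = log_target_unnorm(x) - proposal.logpdf(x)
w = np.exp(log_w - log_w.max())    # stabilise before normalising
w /= w.sum()

estimate = np.sum(w * x**2)        # E_pi[X^2] = 1 for the standard Gaussian
print("self-normalised IS estimate of E[X^2]:", estimate)
```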
A class of adaptive sampling methods is introduced for efficient posterior and predictive simulation...
For importance sampling (IS), multiple proposals can be combined to address different aspects of a t...
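A standard way to combine proposals is the deterministic-mixture (balance heuristic) weighting sketched below, in which each draw is weighted by the target density divided by the average of all proposal densities; the bimodal target and the two Gaussian proposals are assumptions, and this need not be the combination rule of the cited paper.

```python
# Multiple importance sampling with deterministic-mixture (balance heuristic)
# weights: each draw is weighted by target / (average of all proposal pdfs).
# The bimodal Gaussian target and the two Gaussian proposals are assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

def target_pdf(x):  # normalised bimodal target, for illustration
    return 0.5 * stats.norm.pdf(x, -3, 1) + 0.5 * stats.norm.pdf(x, 3, 1)

proposals = [stats.norm(-3, 1.5), stats.norm(3, 1.5)]  # one proposal per mode
n_per = 5_000

xs, ws = [], []
for q in proposals:
    x = q.rvs(size=n_per, random_state=rng)
    mix = np.mean([p.pdf(x) for p in proposals], axis=0)  # mixture density
    xs.append(x)
    ws.append(target_pdf(x) / mix)

x = np.concatenate(xs)
w = np.concatenate(ws)
print("estimate of E[X]  :", np.sum(w * x) / w.sum())      # ~0 by symmetry
print("estimate of E[X^2]:", np.sum(w * x**2) / w.sum())   # ~10 for this target
```

Weighting against the mixture of all proposals, rather than against each draw's own proposal, keeps the weights bounded wherever at least one proposal covers the target well.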
We consider Bayesian inference by importance sampling when the likelihood is analytically intractabl...
What is the “best” model? The answer to this question lies in part in the eyes of the beholder, neve...