This paper describes a method for estimating the marginal likelihood or Bayes factors of Bayesian models using non-parametric importance sampling ("arrogance sampling"). This method can also be used to compute the normalizing constant of probability distributions. Because the required inputs are samples from the distribution to be normalized and the scaled density at those samples, this method may be a convenient replacement for the harmonic mean estimator. The method has been implemented in the open source R package margLikArrogance.
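For orientation, here is a minimal sketch (in Python, assuming NumPy and SciPy) of an estimator that consumes exactly the inputs described above: posterior draws and the unnormalized log density at those draws. It uses reciprocal importance sampling with a Gaussian auxiliary density fitted to the samples; this illustrates only the input/output contract, not the non-parametric construction used by margLikArrogance, and the function name log_marginal_likelihood is hypothetical.

# Illustrative sketch only; not the margLikArrogance algorithm.
import numpy as np
from scipy.stats import multivariate_normal

def log_marginal_likelihood(samples, log_unnorm_post):
    """samples: (n, d) posterior draws.
    log_unnorm_post: length-n array of log prior + log likelihood at those draws."""
    samples = np.asarray(samples, dtype=float)
    if samples.ndim == 1:
        samples = samples[:, None]
    log_unnorm_post = np.asarray(log_unnorm_post, dtype=float)
    # Fit a Gaussian auxiliary density g to the posterior draws.
    mean = samples.mean(axis=0)
    cov = np.cov(samples, rowvar=False)
    log_g = multivariate_normal(mean, cov, allow_singular=True).logpdf(samples)
    # Reciprocal importance sampling: 1/Z is approximately the posterior
    # average of g(theta)/q(theta), so log Z ~= -(logsumexp(log g - log q) - log n).
    diff = log_g - log_unnorm_post
    log_inv_z = np.logaddexp.reduce(diff) - np.log(diff.size)
    return -log_inv_z

Differencing the resulting log marginal likelihoods of two models then gives a log Bayes factor.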
Fitting parameters of interest in an elegant and efficient way via analysis of experimental data is ...
Methods for the systematic application of Monte Carlo integration with importance sampling to Bayesi...
The key quantity needed for Bayesian hypothesis testing and model selection is the marginal likeliho...
Recent advances in Markov chain Monte Carlo (MCMC) extend the scope of Bayesian inference to models ...
The Bayesian principle is conceptually simple and intuitively plausible to carry out, but its numerical i...
Strategic choices for efficient and accurate evaluation of marginal likelihoods by means of Monte Ca...
We consider Bayesian inference by importance sampling when the likelihood is analytically intractabl...
The marginal likelihood, or model evidence, is a key quantity in Bayesian parameter estimation and m...
We consider an adaptive importance sampling approach to estimate the marginal likelihood, a quantit...
The efficiency of a marginal likelihood estimator where the product of the marginal posterior distri...
Important choices for efficient and accurate evaluation of marginal likelihoods by means...
The marginal likelihood is commonly used for comparing different evolutionary models in Ba...
The α-stable distribution is very useful for modelling data with extreme values and skewed behaviour...
In Bayesian inference, a Bayes factor is defined as the ratio of posterior odds versus pri...
We consider an adaptive importance sampling approach to estimating the marginal likelihood, a quanti...