We consider fixed scan Gibbs and block Gibbs samplers for a Bayesian hierarchical random effects model with proper conjugate priors. A drift condition given in Meyn and Tweedie (1993, Chapter 15) is used to show that these Markov chains are geometrically ergodic. Showing that a Gibbs sampler is geometrically ergodic is the first step toward establishing central limit theorems, which can be used to approximate the error associated with Monte Carlo estimates of posterior quantities of interest. Thus, our results will be of practical interest to researchers using these Gibbs samplers for Bayesian data analysis.

Keywords: Bayesian model, central limit theorem, drift condition, Markov chain Monte Carlo, rate of convergence, variance components
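As a point of reference (this display is a generic sketch, not reproduced from the paper), a geometric drift condition of the Meyn and Tweedie type, for a Gibbs transition kernel P, a drift function V >= 1, constants lambda < 1 and b < infinity, and a small set C, takes the form

\[
P V(x) \;=\; \mathbb{E}\bigl[\,V(X_{n+1}) \mid X_n = x\,\bigr] \;\le\; \lambda\, V(x) + b\,\mathbb{I}_C(x) \qquad \text{for all } x,
\]

and geometric ergodicity in turn yields, for functionals $g$ with, e.g., $\mathbb{E}_\pi |g|^{2+\epsilon} < \infty$ for some $\epsilon > 0$, a central limit theorem

\[
\sqrt{n}\,\bigl(\bar{g}_n - \mathbb{E}_\pi g\bigr) \;\xrightarrow{d}\; \mathrm{N}\bigl(0, \sigma_g^2\bigr), \qquad \bar{g}_n = \frac{1}{n}\sum_{i=1}^{n} g(X_i),
\]

whose asymptotic variance $\sigma_g^2$ can be estimated to assess the Monte Carlo error of posterior estimates. The specific drift function constructed for the hierarchical random effects model is given in the paper and is not reproduced here.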