Abstract: Consider the quantile regression model Y = Xβ + σε, where the components of ε are i.i.d. errors from the asymmetric Laplace distribution whose rth quantile equals 0, for a fixed r ∈ (0, 1). Kozumi and Kobayashi (2011) [9] introduced a Gibbs sampler that can be used to explore the intractable posterior density that results when the quantile regression likelihood is combined with the usual normal/inverse gamma prior for (β, σ). In this paper, the Markov chain underlying Kozumi and Kobayashi's (2011) [9] algorithm is shown to converge at a geometric rate. No assumptions are made about the dimension of X, so the result still holds in the “large p, small n” case.
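The tractability of this Gibbs sampler rests on the standard normal-exponential mixture representation of the asymmetric Laplace distribution: an ALD(0, 1, r) error can be written as ε = θz + τ√z·u with z ~ Exp(1), u ~ N(0, 1), θ = (1 − 2r)/(r(1 − r)), and τ² = 2/(r(1 − r)), which yields normal and generalized-inverse-Gaussian full conditionals. A minimal sketch of that representation (the helper name `sample_ald` is illustrative, not from the paper):

```python
import numpy as np

def sample_ald(r, size, rng):
    """Draw ALD(0, 1, r) errors via the normal-exponential mixture:
    eps = theta * z + sqrt(tau2 * z) * u, z ~ Exp(1), u ~ N(0, 1)."""
    theta = (1.0 - 2.0 * r) / (r * (1.0 - r))
    tau2 = 2.0 / (r * (1.0 - r))
    z = rng.exponential(1.0, size)          # latent exponential scales
    u = rng.standard_normal(size)           # standard normal shocks
    return theta * z + np.sqrt(tau2 * z) * u

rng = np.random.default_rng(0)
eps = sample_ald(r=0.25, size=200_000, rng=rng)
# By construction the r-th quantile of the errors should be close to 0.
print(np.quantile(eps, 0.25))
```

Conditioning on the latent z's is what turns the non-smooth quantile-regression likelihood into a Gaussian one in β, which the paper's geometric-ergodicity analysis then exploits.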
Abstract: The geometrical convergence of the Gibbs sampler for simulating a probability distribution i...
To bake a Bayesian pi (posterior) I was taught that you needed an L (likelihood) and a p (prior) – ...
Quantile regression has recently received a great deal of attention in both theoretical and empirica...
University of Minnesota Ph.D. dissertation. July 2009. Major: Statistics. Advisor: Galin L. Jones. 1 ...
We introduce a set of new Gibbs samplers for Bayesian analysis of the quantile regression model. The new...
Abstract: We consider fixed scan Gibbs and block Gibbs samplers for a Bayesian hierarchical random eff...
Lp–quantiles generalise quantiles and expectiles to account for the whole distribution of the random...
The classical theory of linear models focuses on the conditional mean function, i.e. the function th...
Bayesian analysis of data from the general linear mixed model is challenging because any nontrivial ...
We consider Bayesian error-in-variable (EIV) linear regression accounting for additional additive Ga...
Quantile regression has received increasing attention both from a theoretical and from an empirical ...
Quantile regression, as a supplement to the mean regression, is often used when a comprehensive rel...
This paper is a study of the application of Bayesian Exponentially Tilted Empirical Likelihood to in...
We consider two Bayesian hierarchical one-way random effects models and establish geometric ergodic...