A general method for defining informative priors on statistical models is presented and applied specifically to the space of classification and regression trees. A Bayesian approach to learning such models from data is taken, with the Metropolis-Hastings algorithm used to sample approximately from the posterior. Because the proposal distributions are closely tied to the prior, acceptance probabilities are easily computed via marginal likelihood ratios, whatever prior is used. Our approach is tested empirically by varying (i) the data, (ii) the prior, and (iii) the proposal distribution. A comparison with related work is given.
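To make the acceptance rule concrete, the sketch below (a minimal Python illustration, not the paper's implementation) shows an independence Metropolis-Hastings sampler whose proposal distribution is the prior itself: the prior terms cancel in the acceptance ratio, leaving only the marginal likelihood ratio p(data | m') / p(data | m). The helpers `sample_prior` and `log_marginal_likelihood` are hypothetical placeholders; for tree models they would draw a random tree from the tree prior and integrate out the leaf parameters.

```python
# Minimal sketch, assuming generic callables for the prior and the marginal
# likelihood; not tied to any particular prior on trees.
import math
import random

def mh_sample_from_prior(sample_prior, log_marginal_likelihood, data,
                         n_iter=10_000, seed=0):
    """Independence MH sampler whose proposal equals the prior.

    Because the proposal q(m') = p(m'), the prior terms cancel and the
    acceptance probability is min(1, p(data | m') / p(data | m)).
    """
    rng = random.Random(seed)
    current = sample_prior(rng)
    current_ll = log_marginal_likelihood(current, data)
    samples = []
    for _ in range(n_iter):
        proposal = sample_prior(rng)
        proposal_ll = log_marginal_likelihood(proposal, data)
        # Acceptance probability reduces to the marginal likelihood ratio.
        accept_logprob = min(0.0, proposal_ll - current_ll)
        if rng.random() < math.exp(accept_logprob):
            current, current_ll = proposal, proposal_ll
        samples.append(current)
    return samples

# Toy usage: the "model" is one of three candidate means for Gaussian data
# with known unit variance, under a uniform prior over the candidates.
def sample_prior(rng):
    return rng.choice([0.0, 1.0, 2.0])

def log_marginal_likelihood(mu, data):
    return sum(-0.5 * (y - mu) ** 2 - 0.5 * math.log(2 * math.pi) for y in data)

data = [0.9, 1.2, 1.1, 0.8, 1.05]
draws = mh_sample_from_prior(sample_prior, log_marginal_likelihood, data)
print({m: draws.count(m) / len(draws) for m in [0.0, 1.0, 2.0]})
```

The sampler itself never evaluates the prior density, which is why the same acceptance computation works regardless of which informative prior is placed on the model structure, provided one can draw from it.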
This article demonstrates the usefulness of Bayesian estimation with small samples. In Bayesian esti...
Right-stochastic matrices are used in the modelling of discrete-time Markov processes, with a proper...
One desirable property of machine learning algorithms is the ability to balance the number of p...
We present a general framework for defining priors on model structure and sampling from the posterio...
We propose a Bayesian framework for regression problems, which covers areas which are usually dealt ...
This paper presents and evaluates an approach to Bayesian model averaging where the models are Bayes...
In this paper we extend a methodology for Bayesian learning via MCMC, with the ability to grow arbit...
We present a general framework for defining priors on model structure and sampling from the posterio...
In this article we put forward a Bayesian approach for finding classification and regression tree (C...
Eliciting informative prior distributions for Bayesian inference can often be complex and challengin...
A major problem associated with Bayesian estimation is selecting the prior distribution. The more re...
We provide a review of prior distributions for objective Bayesian analysis. We start by examining so...
In contrast to a posterior analysis given a particular sampling model, posterior model probabilities...
It can be important in Bayesian analyses of complex models to construct informative prior distributi...
The reference priors, initiated in Bernardo (1979) and further developed in Berger and Bernardo (199...