This thesis addresses several issues arising in Bayesian statistics. Firstly, computations for approximating Bayesian posteriors are often performed using Markov chain Monte Carlo (MCMC) methods; however, standard MCMC algorithms tend to perform poorly when the posterior distribution has multiple modes. Secondly, the performance of MCMC methods depends heavily on the choice of hyperparameters, so it is beneficial to design adaptive methods that update the hyperparameters on the fly as the algorithm runs. A further topic explored in this thesis is the potential misspecification of the Bayesian model under consideration. It is well understood that standard Bayesian inference, when applied to misspecified models, leads to mislead...
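As a purely illustrative sketch (not taken from the thesis), the random-walk Metropolis sampler below targets a toy bimodal density and shows how a single fixed proposal scale (the hypothetical `step_size` parameter) can leave the chain stuck near one mode; this is the kind of behaviour that motivates both multimodal-aware and adaptive MCMC methods. All names here are placeholders.

```python
import numpy as np

def log_target(x):
    # Toy bimodal density: mixture of two well-separated unit-variance Gaussians.
    return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

def random_walk_metropolis(n_iter=10_000, step_size=0.5, x0=-4.0, seed=0):
    rng = np.random.default_rng(seed)
    samples = np.empty(n_iter)
    x, logp = x0, log_target(x0)
    for i in range(n_iter):
        prop = x + step_size * rng.standard_normal()
        logp_prop = log_target(prop)
        # Accept with the usual Metropolis probability min(1, target ratio).
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = prop, logp_prop
        samples[i] = x
    return samples

# With a small step size the chain rarely crosses between the two modes,
# so this fraction stays close to 0 even though the true value is 0.5.
draws = random_walk_metropolis(step_size=0.5)
print("fraction of samples near the right mode:", np.mean(draws > 0))
```

Adaptive variants replace the fixed `step_size` with a scale tuned on the fly, for instance by targeting a prescribed acceptance rate, rather than requiring it to be chosen in advance.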
This thesis focuses on sources of error in modern Bayesian analysis and machine learning in the ``bi...
The Markov Chain Monte Carlo (MCMC) technique provides a means to generate a random sequence of mode...
While modern machine learning and deep learning seem to dominate the areas where scalability and mod...
Bayesian methods provide the means for studying probabilistic models of linear as well as non-linear...
This thesis explores how a Bayesian should update their beliefs in the knowledge that any model ava...
We describe adaptive Markov chain Monte Carlo (MCMC) methods for sampling posterior distributions ar...
Recent advances in stochastic gradient variational inference have made it possible to perform varia...
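This abstract is cut short, but as a generic, hedged illustration of stochastic gradient variational inference (not the cited paper's own method), one can fit a Gaussian approximation q(θ) = N(m, s²) to an unnormalized log posterior by ascending a Monte Carlo estimate of the ELBO with the reparameterization trick. The names `log_joint`, `m`, and `log_s` are placeholders.

```python
import numpy as np

def log_joint(theta):
    # Placeholder unnormalized log posterior: a standard normal target.
    return -0.5 * theta ** 2

def elbo_and_grads(m, log_s, rng, n_mc=32):
    s = np.exp(log_s)
    eps = rng.standard_normal(n_mc)
    theta = m + s * eps                       # reparameterization: theta = m + s * eps
    # ELBO = E_q[log p(theta, data)] + entropy of q (up to an additive constant).
    elbo = np.mean(log_joint(theta)) + log_s
    dlogp = -theta                            # d/dtheta of log_joint for this toy target
    grad_m = np.mean(dlogp)                   # pathwise gradient w.r.t. the mean m
    grad_log_s = np.mean(dlogp * eps * s) + 1.0  # pathwise gradient w.r.t. log_s plus entropy term
    return elbo, grad_m, grad_log_s

rng = np.random.default_rng(0)
m, log_s, lr = 3.0, 0.0, 0.05
for _ in range(2000):
    _, gm, gs = elbo_and_grads(m, log_s, rng)
    m, log_s = m + lr * gm, log_s + lr * gs   # stochastic gradient ascent on the ELBO
print("fitted mean and std:", m, np.exp(log_s))  # should approach 0 and 1
```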
The Bayesian approach has become one of the central interests in statistical inference, du...
Computational Bayesian statistics builds approximations to the posterior distribution either by sampl...
We investigate Bayesian alternatives to classical Monte Carlo methods for evaluating integrals. Baye...
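For context only (the abstract is cut short): a classical Monte Carlo estimate of an integral ∫ f(x) p(x) dx averages f at samples drawn from p, whereas Bayesian alternatives such as Bayesian quadrature place a prior (typically a Gaussian process) on f and report a posterior over the integral's value. The sketch below shows only the classical baseline such work contrasts against; the integrand and sample size are illustrative choices.

```python
import numpy as np

def f(x):
    return np.exp(-x ** 2)            # integrand

def classical_mc_integral(n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)        # samples from p(x) = N(0, 1)
    vals = f(x)
    est = vals.mean()                 # estimates E_p[f(X)] = \int f(x) p(x) dx
    se = vals.std(ddof=1) / np.sqrt(n)  # standard error shrinks at rate 1/sqrt(n)
    return est, se

est, se = classical_mc_integral()
print(f"estimate {est:.5f} +/- {se:.5f}")  # true value is 1/sqrt(3) ~= 0.57735
```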
For half a century computational scientists have been numerically simulating complex systems. Uncert...
The Markov chain Monte Carlo (MCMC) method, born in the early 1950s, has recently aroused great interest among s...
We present an overview of Markov chain Monte Carlo, a sampling method for model infe...
We introduce a framework for efficient Markov chain Monte Carlo algorithms targeting discrete-valued...
Traditional algorithms for Bayesian posterior inference require processing the entire dataset in eac...
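As a hedged illustration of the per-iteration cost this abstract alludes to (its proposed remedy is cut off), a standard acceptance step in MCMC evaluates the log-likelihood over the entire dataset, so each iteration is O(N); subsampling approaches replace that sum with an estimate computed from a minibatch. The dataset, model, and function names below are placeholders, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, scale=1.0, size=100_000)    # toy dataset

def full_loglik(theta, data):
    # Full-data Gaussian log-likelihood (up to constants): touches every observation, O(N).
    return -0.5 * np.sum((data - theta) ** 2)

def subsampled_loglik(theta, data, m=1_000):
    # Unbiased estimate from a random minibatch of size m: O(m) per call.
    idx = rng.integers(len(data), size=m)
    return len(data) / m * -0.5 * np.sum((data[idx] - theta) ** 2)

theta = 1.4
print("full:", full_loglik(theta, data))
print("subsampled estimate:", subsampled_loglik(theta, data))
```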