© 2014, Institute of Mathematical Statistics. All rights reserved. We investigate mean field variational approximate Bayesian inference for models that use continuous distributions (Horseshoe, Negative-Exponential-Gamma and Generalized Double Pareto) for sparse signal shrinkage. Our principal finding is that the most natural, and simplest, mean field variational Bayes algorithm can perform quite poorly due to posterior dependence among auxiliary variables. More sophisticated algorithms, based on special functions, are shown to be superior. Continued fraction approximations via Lentz's Algorithm are developed to make the algorithms practical.
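Lentz's Algorithm, mentioned at the end of the abstract, evaluates a continued fraction b_0 + a_1/(b_1 + a_2/(b_2 + ...)) forward term by term, stopping once successive convergents stabilise. The sketch below is a minimal, generic implementation of the modified Lentz scheme as described in standard numerical references; it is not the paper's special-function code, and the function name, interface and golden-ratio example are our own.

```python
def lentz(a, b, tol=1e-12, max_terms=10_000, tiny=1e-30):
    """Evaluate b(0) + a(1)/(b(1) + a(2)/(b(2) + ...)) by the modified
    Lentz method.  a(j) and b(j) return the j-th partial numerator and
    denominator; `tiny` guards against zero denominators."""
    f = b(0)
    if f == 0.0:
        f = tiny
    C, D = f, 0.0                      # ratios of successive convergents
    for j in range(1, max_terms):
        D = b(j) + a(j) * D
        if D == 0.0:
            D = tiny
        C = b(j) + a(j) / C
        if C == 0.0:
            C = tiny
        D = 1.0 / D
        delta = C * D
        f *= delta
        if abs(delta - 1.0) < tol:     # convergents have stabilised
            return f
    raise RuntimeError("continued fraction did not converge")

# Example: the golden ratio, 1 + 1/(1 + 1/(1 + ...)) = 1.6180339887...
print(lentz(lambda j: 1.0, lambda j: 1.0))
```

Working forward in this way avoids having to guess a truncation depth in advance, which is what makes continued fraction expansions of special-function ratios usable inside an iterative variational scheme.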
The problem of estimating a high-dimensional sparse vector θ ∈ ℝⁿ from an observation in i.i.d. Gau...
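For concreteness, the sparse normal means setup referred to in this and several later abstracts can be written as follows (a standard formulation; the notation, including the noise variance σ², is ours):

$$
y_i = \theta_i + \varepsilon_i, \qquad \varepsilon_i \overset{\text{i.i.d.}}{\sim} N(0, \sigma^2), \qquad i = 1, \dots, n,
$$

where only a small, unknown fraction of the coordinates of θ are nonzero, and continuous shrinkage or spike and slab priors on θ encode that sparsity.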
Variational inference (VI) or Variational Bayes (VB) is a popular alternative to MCMC, which doesn't...
In various applications, we deal with high-dimensional positive-valued data that often exhibits spar...
We develop strategies for mean field variational Bayes approximate inference for Bayesian hierarchic...
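To make the "most natural, and simplest" mean field scheme criticised in the opening abstract concrete, here is a minimal single-coordinate sketch using the inverse-gamma auxiliary-variable representation of a half-Cauchy (horseshoe-type) scale. It assumes unit noise variance and unit global scale, uses the factorisation q(theta) q(lam2) q(nu), and all names are our own; it illustrates the naive auxiliary-variable factorisation, not the special-function algorithms the paper recommends.

```python
def naive_mfvb_horseshoe(y, n_iter=100):
    """Naive mean field VB for:  y | theta ~ N(theta, 1),
    theta | lam2 ~ N(0, lam2),  lam2 | nu ~ IG(1/2, 1/nu),  nu ~ IG(1/2, 1),
    with factorisation q(theta) q(lam2) q(nu).  Every update is closed form."""
    E_inv_lam2, E_inv_nu = 1.0, 1.0     # expectations of 1/lam2 and 1/nu
    for _ in range(n_iter):
        # q(theta) = N(mu, s2)
        s2 = 1.0 / (1.0 + E_inv_lam2)
        mu = s2 * y
        E_theta2 = mu**2 + s2
        # q(lam2) = Inverse-Gamma(1, E[theta^2]/2 + E[1/nu]) => E[1/lam2] = 1/rate
        E_inv_lam2 = 1.0 / (0.5 * E_theta2 + E_inv_nu)
        # q(nu)   = Inverse-Gamma(1, 1 + E[1/lam2])          => E[1/nu]   = 1/rate
        E_inv_nu = 1.0 / (1.0 + E_inv_lam2)
    return mu, s2   # approximate posterior mean and variance of theta

print(naive_mfvb_horseshoe(0.1))   # small observation: shrunk substantially toward 0
print(naive_mfvb_horseshoe(5.0))   # large observation: only mild shrinkage
```

The updates are trivially closed form, which is the appeal of the auxiliary-variable route; the cost, per the opening abstract, is that this factorisation ignores strong posterior dependence among the auxiliary variables and can degrade the approximation.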
We study a mean-field spike and slab variational Bayes (VB) approximation to Bayesian model selectio...
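The spike and slab construction referred to in this entry has, in its most common form, a prior that mixes a point mass at zero with a continuous slab, and a mean field family that mirrors that mixture coordinate-wise (a standard formulation; the slab G is typically Gaussian or Laplace, and the notation is ours):

$$
\theta_i \mid w \overset{\text{ind.}}{\sim} (1 - w)\,\delta_0 + w\,G, \qquad
q(\boldsymbol{\theta}) = \prod_{i=1}^{n} \left[ (1 - \gamma_i)\,\delta_0 + \gamma_i\, N(\mu_i, \sigma_i^2) \right],
$$

where δ_0 is a point mass at zero, w is the prior inclusion probability, and γ_i is the variational posterior inclusion probability of coordinate i.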
Variational Inference (VI) has become a popular technique to approximate difficult-to-compute poster...
This work considers variational Bayesian inference as an inexpensive and scalable alternative to a f...
In this paper we address the problem of sparse representation (SR) within a Ba...
In all areas of human knowledge, datasets are increasing in both size and complexity, creating the n...
We consider a high-dimensional sparse normal means model where the goal is to estimate the mean vect...
Variational approximations are approximate inference techniques for complex statistical models provid...
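As a reminder of the construction all of these entries build on: a variational approximation picks q from a tractable family to minimise the Kullback-Leibler divergence to the posterior, which is equivalent to maximising the evidence lower bound (a standard identity; notation ours):

$$
\log p(y) = \underbrace{\mathbb{E}_{q}\!\left[\log \frac{p(y, \theta)}{q(\theta)}\right]}_{\mathrm{ELBO}(q)} \;+\; \mathrm{KL}\!\left(q(\theta)\,\|\,p(\theta \mid y)\right).
$$

Under the mean field restriction q(θ) = ∏_j q_j(θ_j), the ELBO is maximised coordinate-wise by q_j(θ_j) ∝ exp{E_{-j}[log p(y, θ)]}, which is the update structure the algorithms described above iterate.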
Sparsity is a fundamental concept of modern statistics, and often the only general principle availab...
University of Technology Sydney, Faculty of Science. The focus of this thesis is on the development a...