Previous work on probabilistic topic models has either focused on models with relatively simple conjugate priors that support Gibbs sampling, or on models with non-conjugate priors that typically require variational inference. Gibbs sampling is more accurate than variational inference and better supports the construction of composite models. We present a method for Gibbs sampling in non-conjugate logistic normal topic models, and demonstrate it on a new class of topic models with arbitrary graph-structured priors that reflect the complex relationships commonly found in document collections, while retaining simple, robust inference.
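The logistic-normal construction the abstract refers to draws a Gaussian vector and maps it through the softmax to obtain document-topic proportions, which is what makes the prior non-conjugate to the multinomial likelihood. A minimal sketch of that generative step, assuming NumPy (the function name and parameter values are illustrative, not from the paper):

```python
import numpy as np

def sample_topic_proportions(mu, Sigma, rng):
    """Draw document-topic proportions from a logistic-normal prior:
    eta ~ N(mu, Sigma), theta = softmax(eta).

    The covariance Sigma encodes correlations between topics, which
    a Dirichlet prior cannot express."""
    eta = rng.multivariate_normal(mu, Sigma)
    e = np.exp(eta - eta.max())   # subtract max for numerical stability
    return e / e.sum()

rng = np.random.default_rng(0)
K = 4                              # number of topics (illustrative)
mu = np.zeros(K)
Sigma = 0.5 * np.eye(K) + 0.5      # unit variance, 0.5 cross-topic correlation
theta = sample_topic_proportions(mu, Sigma, rng)
```

Because `theta` is a deterministic softmax transform of a Gaussian rather than a Dirichlet draw, the multinomial word counts no longer yield a closed-form posterior, which is the obstacle the paper's Gibbs sampling method addresses.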
Topic models, such as latent Dirichlet allocation (LDA), have been an effective tool for the statist...
Topic models, such as latent Dirichlet allocation (LDA), can be useful tools for the statistical ana...
In this section we propose a hybrid Gibbs and variational inference for our differential topic model...
Logistic-normal topic models can effectively discover correlation structures among latent topics. Ho...
Inference is a central problem in probabilistic graphical models, and is often the main sub-step in ...
The proliferation of large electronic document archives requires new techniques for automatically an...
In topic modelling, various alternative priors have been developed, for instance asymmetric and symm...
The logistic normal distribution has recently been adapted via the transformation of multivariate Ga...