Exact inference in densely connected Bayesian networks is computationally intractable, and so there is considerable interest in developing effective approximation schemes. One approach which has been adopted is to bound the log likelihood using a mean-field approximating distribution. While this leads to a tractable algorithm, the mean field distribution is assumed to be factorial and hence unimodal. In this paper we demonstrate the feasibility of using a richer class of approximating distributions based on mixtures of mean field distributions. We derive an efficient algorithm for updating the mixture parameters and apply it to the problem of learning in sigmoid belief networks. Our results demonstrate a systematic improvement over simple m...
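The key claim above — that a factorial mean-field distribution is unimodal while a mixture of factorial distributions is not — can be checked numerically. The toy target distribution, the component parameters, and all names below are illustrative, not taken from the paper; this is only a sketch of why a mixture can fit a bimodal posterior better than a single factorial q.

```python
import numpy as np
from itertools import product

# Toy bimodal target over 3 binary variables, with modes at 000 and 111.
states = np.array(list(product([0, 1], repeat=3)))
p = np.full(len(states), 1e-3)
p[0] = p[-1] = 0.5
p /= p.sum()

def factorial_prob(mu, states):
    """Factorial q(s) = prod_i mu_i^{s_i} (1-mu_i)^{1-s_i}, on every state."""
    return np.prod(np.where(states == 1, mu, 1.0 - mu), axis=1)

def kl(q, p):
    """KL(q || p), the direction used in variational bounds."""
    mask = q > 0
    return np.sum(q[mask] * np.log(q[mask] / p[mask]))

# A single factorial distribution (here: uniform marginals) is unimodal
# and cannot place mass on both modes at once.
q_single = factorial_prob(np.full(3, 0.5), states)

# A two-component mixture of factorials, one component per mode.
q_mix = (0.5 * factorial_prob(np.full(3, 0.01), states)
         + 0.5 * factorial_prob(np.full(3, 0.99), states))

print(kl(q_single, p), kl(q_mix, p))  # mixture KL is far smaller
```

The mixture covers both modes of the target, so its KL divergence to the bimodal target is much smaller than that of any single factorial distribution, which must spread its mass unimodally.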
Approximating the inference probability Pr[X = x | E = e] in any sense, even for a single ev...
Bayesian Belief Networks are graph-based representations of probability distributions. In the last d...
Bayesian belief networks can represent the complicated probabilistic processes that form natural sen...
Exact inference in large, densely connected probabilistic networks is computationally intractable, ...
Computation of marginal probabilities in Bayesian Belief Networks is central to many probabilistic r...
The chief aim of this paper is to propose mean-field approximations for a broad class of Belief ne...
Sigmoid type belief networks, a class of probabilistic neural networks, provide a natural framework ...
We develop a mean field theory for sigmoid belief networks based on ideas from statistical mechani...
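The mean-field theory mentioned in the abstracts above can be illustrated with a minimal fixed-point sketch. For simplicity this uses the generic mean-field update for a symmetric binary (Boltzmann-style) network rather than the sigmoid-belief-network derivation, which requires additional variational parameters; all weights and names are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mean_field_update(W, b, n_iter=200):
    """Coordinate-ascent mean field for binary units s_i in {0, 1}.

    Approximates p(s) ∝ exp(0.5 s^T W s + b^T s) with a factorial
    q(s) = prod_i mu_i^{s_i} (1-mu_i)^{1-s_i}.
    Fixed point: mu_i = sigmoid(sum_j W_ij mu_j + b_i).
    """
    mu = np.full(len(b), 0.5)     # uniform initialisation
    for _ in range(n_iter):
        for i in range(len(b)):   # sequential coordinate updates
            mu[i] = sigmoid(W[i] @ mu + b[i])
    return mu

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4))
W = (A + A.T) / 2                 # symmetric couplings
np.fill_diagonal(W, 0.0)          # no self-coupling
b = rng.normal(size=4)

mu = mean_field_update(W, b)
# At convergence each mu_i satisfies its fixed-point equation.
print(np.max(np.abs(mu - sigmoid(W @ mu + b))))
```

Each coordinate update exactly minimises the variational free energy in that coordinate, so repeated sweeps drive the residual of the fixed-point equation toward zero.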
A new approach for learning Bayesian belief networks from raw data is presented. The approach is bas...
Highly expressive directed latent variable models, such as sigmoid belief networks, are difficult ...