Abstract: Even though existing algorithms for belief update in Bayesian networks (BNs) have exponential worst-case time and space complexity, belief update is feasible in many real-world BNs. In some cases, however, the efficiency of belief update may still be insufficient; there, even minor improvements in efficiency can be important, or even necessary, to make a task tractable. This paper introduces two improvements to message computation in Lazy Propagation (LP): (1) myopic methods for ordering the operations involved in a variable elimination using arc reversal, and (2) an extension of LP with the any-space property. The performance impact of both methods is assessed empirically.
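The abstract mentions "myopic" (greedy, locally scored) ordering of elimination operations. As a rough illustration of that general idea, the sketch below implements the classic min-fill heuristic for choosing an elimination order on a moral graph: at each step it eliminates the variable whose removal adds the fewest fill-in edges. This is a hypothetical sketch for intuition only, not the paper's arc-reversal-based method; the graph and function names are the author's illustration, not from the source.

```python
# Minimal sketch of a myopic (greedy) elimination-ordering heuristic.
# NOT the paper's algorithm: it only illustrates scoring the next
# elimination step locally, here with the standard min-fill criterion.
from itertools import combinations


def min_fill_order(adjacency):
    """Return an elimination order over an undirected (moral) graph,
    greedily picking the variable whose elimination adds the fewest
    fill-in edges among its remaining neighbours."""
    adj = {v: set(ns) for v, ns in adjacency.items()}
    order = []
    while adj:
        def fill_cost(v):
            # number of neighbour pairs not already connected
            return sum(1 for a, b in combinations(adj[v], 2)
                       if b not in adj[a])

        v = min(adj, key=fill_cost)   # myopic choice: best next step only
        order.append(v)
        # add fill-in edges among v's neighbours, then remove v
        for a, b in combinations(adj[v], 2):
            adj[a].add(b)
            adj[b].add(a)
        for n in adj[v]:
            adj[n].discard(v)
        del adj[v]
    return order


# Moral graph of a tiny BN: A -> C <- B, C -> D (A-B is the moral edge)
graph = {
    "A": {"B", "C"},
    "B": {"A", "C"},
    "C": {"A", "B", "D"},
    "D": {"C"},
}
print(min_fill_order(graph))
```

A global (non-myopic) search over orderings is NP-hard in general, which is why greedy local scores such as min-fill or min-degree are common in practice.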