Abstract

Even though existing algorithms for belief update in Bayesian networks (BNs) have exponential time and space complexity, belief update in many real-world BNs is feasible. However, in some cases the efficiency of belief update may be insufficient. In such cases, minor improvements in efficiency may be important or even necessary to make a task tractable. This paper introduces two improvements to the message computation in Lazy Propagation (LP): (1) we introduce myopic methods for sorting the operations involved in a variable elimination using arc-reversal, and (2) we extend LP with the any-space property. The performance impacts of the methods are assessed empirically.
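To make the first idea concrete, the following is a minimal, hypothetical sketch of a myopic (greedy) scheme: at each step of a single variable elimination it scores every candidate combination of potentials by the state-space size of the result and applies the cheapest one. All names (`myopic_elimination_order`, `table_size`) and the scoring rule are illustrative assumptions, not the paper's actual method; in particular, the paper's methods order arc-reversal operations within LP's message computation, which this sketch does not model.

```python
# Illustrative sketch only: greedy ("myopic") scheduling of the combinations
# performed while eliminating one variable. Potentials are represented solely
# by their variable domains (sets of variable names); real LP works on the
# potentials themselves and on arc-reversal operations.
from itertools import combinations


def table_size(domains, variables):
    """State-space size of a potential over `variables` (product of domain sizes)."""
    size = 1
    for v in variables:
        size *= domains[v]
    return size


def myopic_elimination_order(potentials, domains, target):
    """Greedily combine the potentials that mention `target`, always choosing
    the pair whose combination has the smallest state space (a myopic score)."""
    relevant = [frozenset(p) for p in potentials if target in p]
    others = [frozenset(p) for p in potentials if target not in p]
    schedule = []
    while len(relevant) > 1:
        # Myopic step: score every candidate pairwise combination and pick the
        # one that is cheapest right now, ignoring its effect on later steps.
        a, b = min(combinations(relevant, 2),
                   key=lambda pair: table_size(domains, pair[0] | pair[1]))
        relevant.remove(a)
        relevant.remove(b)
        combined = a | b
        schedule.append((a, b, combined))
        relevant.append(combined)
    # Marginalize `target` out of the final combined potential.
    result = (relevant[0] - {target}) if relevant else frozenset()
    return schedule, others + [result]


# Example: eliminate variable 'C' from four potentials over small domains.
domains = {'A': 2, 'B': 3, 'C': 2, 'D': 2}
potentials = [{'A', 'C'}, {'B', 'C'}, {'C', 'D'}, {'A', 'B'}]
schedule, remaining = myopic_elimination_order(potentials, domains, 'C')
print(schedule)   # order in which potentials were combined
print(remaining)  # potentials left after 'C' has been eliminated
```

In an actual LP implementation the score would be evaluated on the concrete operations (including arc reversals) rather than on variable sets, but the greedy structure of choosing the locally cheapest next operation is the same.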