Abstract

We show how to find a small loop cutset in a Bayesian network. Finding such a loop cutset is the first step in the method of conditioning for inference. Our algorithm for finding a loop cutset, called MGA, finds a loop cutset that is guaranteed in the worst case to contain less than twice the number of variables contained in a minimum loop cutset. The algorithm is based on a reduction to the weighted vertex feedback set problem and a 2-approximation of the latter problem. The complexity of MGA is O(m + n log n), where m and n are the number of edges and vertices, respectively. A greedy algorithm, called GA, for the weighted vertex feedback set problem is also analyzed and bounds on its performance are given. We test MGA on randomly generated ...
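To make the greedy idea concrete, the sketch below shows one plausible greedy heuristic for the weighted vertex feedback set problem: repeatedly discard vertices of degree at most one (they lie on no cycle), then remove the vertex with the smallest weight-to-degree ratio until the graph is acyclic. This is an illustrative reconstruction under stated assumptions, not the exact GA or MGA procedure from the paper; the graph representation, function name, and selection rule are assumptions made for the example.

# A minimal sketch (not the paper's exact GA/MGA) of a greedy heuristic for the
# weighted vertex feedback set problem. Assumes an undirected graph given as an
# adjacency dictionary and positive vertex weights.

def greedy_weighted_fvs(adj, weight):
    """adj: dict mapping each vertex to a set of its neighbours.
    weight: dict mapping each vertex to a positive weight.
    Returns a set of vertices whose removal leaves the graph acyclic."""
    # Work on a copy so the caller's graph is untouched.
    adj = {v: set(nbrs) for v, nbrs in adj.items()}
    fvs = set()

    def remove(v):
        # Delete v and all edges incident to it.
        for u in adj[v]:
            adj[u].discard(v)
        del adj[v]

    while True:
        # Phase 1: repeatedly delete vertices of degree <= 1; they lie on no cycle.
        pruned = True
        while pruned:
            pruned = False
            for v in list(adj):
                if len(adj[v]) <= 1:
                    remove(v)
                    pruned = True
        if not adj:
            return fvs  # graph is empty, hence acyclic
        # Phase 2: greedily pick the vertex with minimum weight-to-degree ratio.
        v = min(adj, key=lambda u: weight[u] / len(adj[u]))
        fvs.add(v)
        remove(v)

if __name__ == "__main__":
    # Two triangles sharing vertex 'c'; removing 'c' breaks both cycles.
    graph = {
        "a": {"b", "c"}, "b": {"a", "c"},
        "c": {"a", "b", "d", "e"},
        "d": {"c", "e"}, "e": {"c", "d"},
    }
    w = {v: 1.0 for v in graph}
    print(greedy_weighted_fvs(graph, w))  # {'c'}

In the toy example the greedy choice happens to be optimal; the paper analyzes how far such a greedy solution can be from optimal for GA, and obtains the worst-case factor-2 guarantee only through MGA's reduction to weighted vertex feedback set and the 2-approximation of that problem.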