The biggest limitation of probabilistic graphical models is the complexity of inference, which is often intractable. An appealing alternative is to use tractable probabilistic models, such as arithmetic circuits (ACs) and sum-product networks (SPNs), in which marginal and conditional queries can be answered efficiently. In this paper, we present the first discriminative structure learning algorithm for ACs, DACLearn (Discriminative AC Learner), which optimizes conditional log-likelihood. In our experiments, DACLearn learns models that are more accurate and compact than other tractable generative and discriminative baselines.
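To make the tractability claim concrete, the following minimal Python sketch evaluates a toy sum-product network over two binary variables. The structure, the weights 0.7/0.3, and the helper names leaf and spn are illustrative assumptions, not part of DACLearn: they only show that marginal queries reduce to setting the indicator leaves of summed-out variables to 1, so joint, marginal, and conditional probabilities each take a single bottom-up pass over the circuit.

# Toy two-variable SPN: a root sum node over two product nodes, each product
# node multiplying indicator leaves for X1 and X2. Structure and weights are
# made up for illustration.
def leaf(var, val, evidence):
    # Indicator leaf: 1 if the variable is marginalized out or matches its
    # assigned value in the evidence, 0 otherwise.
    if var not in evidence:
        return 1.0
    return 1.0 if evidence[var] == val else 0.0

def spn(evidence):
    # Product nodes cover disjoint variables (decomposability); the sum node
    # mixes children defined over the same variables (completeness).
    p1 = leaf("X1", 1, evidence) * leaf("X2", 1, evidence)
    p2 = leaf("X1", 0, evidence) * leaf("X2", 0, evidence)
    return 0.7 * p1 + 0.3 * p2

# Each query below is one pass over the circuit, linear in its size.
p_joint = spn({"X1": 1, "X2": 1})   # P(X1=1, X2=1) = 0.7
p_x1 = spn({"X1": 1})               # P(X1=1) = 0.7, with X2 summed out
p_cond = p_joint / p_x1             # P(X2=1 | X1=1) = 1.0
print(p_joint, p_x1, p_cond)

A discriminative objective such as the conditional log-likelihood used by DACLearn, sum over examples of log P(y | x), is computed from exactly this kind of conditional query.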
Many real-world problems involve data that have both complex structures and uncertainty. Statist...
Sum-product networks (SPNs) are expressive probabilistic models with a rich set of exact and efficie...
The need for feasible inference in Probabilistic Graphical Models (PGMs) has led to tractable model...
Probabilistic graphical models have been successfully applied to a wide variety of fields such as co...
Graphical models are usually learned without regard to the cost of doing inference with them. As a ...
Discriminatively-trained probabilistic models often outperform their generative counterparts on chal...
The probabilistic sentential decision diagram (PSDD) was recently introduced as a tractable represen...
Markov networks are an effective way to represent complex probability distributions. However, lear...
Probabilistic circuits (PCs) are a promising avenue for probabilistic modeling, as they permit a wid...
Probabilistic generating circuits (PGCs) are economical representations of multivariate probability ...
This paper proposes a new classification model called logistic circuits. On MNIST and Fashion datase...
Sum-product networks (SPNs) are a recently developed class of deep probabilistic models where infere...
Tractable learning aims to learn probabilistic models where inference is guaranteed to be efficient...