Many applications, including machine learning, benefit from computing the exact divergence between two discrete probability measures. Unfortunately, in the absence of any assumptions on the structure or independencies within these distributions, computing the divergence between them is intractable in high dimensions. We show that a wide family of functionals and divergences, such as the alpha-beta divergence, can be computed between two decomposable models, i.e., chordal Markov networks, in time exponential in the treewidth of these models. The alpha-beta divergence is a family of divergences that includes popular divergences such as the Kullback-Leibler divergence, the Hellinger distance, and the chi-squared divergence.
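For reference, one standard parameterization of the alpha-beta divergence between discrete distributions $p$ and $q$ is given below; it follows the Cichocki-style definition, and since the abstract above does not state its exact parameterization, this should be read as an illustrative form rather than the authors' own definition:
\[
D_{AB}^{(\alpha,\beta)}(p \,\|\, q) = -\frac{1}{\alpha\beta} \sum_{x} \left( p(x)^{\alpha} q(x)^{\beta} - \frac{\alpha}{\alpha+\beta}\, p(x)^{\alpha+\beta} - \frac{\beta}{\alpha+\beta}\, q(x)^{\alpha+\beta} \right), \qquad \alpha, \beta, \alpha+\beta \neq 0,
\]
with the limit $(\alpha,\beta) \to (1,0)$ recovering the Kullback-Leibler divergence and $\alpha = \beta = \tfrac{1}{2}$ recovering the squared Hellinger distance up to a constant factor.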
Contrastive divergence (CD) is a promising method of inference in high dimensional distributions wi...
summary: We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a...
summary: We compute the expected value of the Kullback-Leibler divergence of various fundamental stat...
We propose new nonparametric, consistent Rényi-alpha and Tsallis-alpha divergence estimators for con...
Kullback–Leibler divergence KL(p,q) is the standard measure of error when we have a true probability...
In this paper we integrate two essential processes, discretization of continuous data and learning o...
Inferring and comparing complex, multivariable probability density functions is fundamental to probl...
Approximating a divergence between two probability distributions from their samples is a fundamenta...
Divergence measures are widely used in various applications of pattern recognition, signal processin...
Based on rescaling by some suitable sequence instead of the number of time units, the usual notion o...
The Kullback–Leibler (KL) divergence is a fundamental measure of information geometry that is used i...
We aim at the construction of a Hidden Markov Model (HMM) of assigned complexity (number of states o...
summary: In this paper we establish an upper and a lower bound for the $f$-divergence of two discrete...
Abstract. We review recent results about the maximal values of the Kullback-Leibler information dive...
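As a quick numerical companion to the abstracts above, the following minimal Python sketch evaluates the alpha-beta divergence on two small discrete distributions and checks its Kullback-Leibler and Hellinger special cases. It is illustrative only: the helper names, the two three-outcome distributions, and the parameter choices are made up rather than taken from any of the cited works.

import numpy as np

def ab_divergence(p, q, alpha, beta):
    # Alpha-beta divergence (Cichocki-style parameterization), valid for alpha, beta, alpha+beta != 0.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    term = (p**alpha * q**beta
            - alpha / (alpha + beta) * p**(alpha + beta)
            - beta / (alpha + beta) * q**(alpha + beta))
    return -term.sum() / (alpha * beta)

def kl_divergence(p, q):
    # KL(p || q) = sum_x p(x) * log(p(x) / q(x)), assuming strictly positive distributions.
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

p = np.array([0.5, 0.3, 0.2])   # made-up discrete distribution
q = np.array([0.4, 0.4, 0.2])   # made-up discrete distribution

# (alpha, beta) -> (1, 0) recovers KL; a small beta approximates the limit.
print(ab_divergence(p, q, alpha=1.0, beta=1e-6), kl_divergence(p, q))

# alpha = beta = 1/2 recovers the squared Hellinger distance up to a factor of 4.
hellinger_sq = 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)
print(ab_divergence(p, q, alpha=0.5, beta=0.5), 4 * hellinger_sq)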