We consider the problem of computing the Kullback-Leibler distance, also called the relative entropy, between a probabilistic context-free grammar and a probabilistic finite automaton. We show that there is a closed-form (analytical) solution for one part of the Kullback-Leibler distance, viz. the cross-entropy. We discuss several applications of the result to the problem of distributional approximation of probabilistic context-free grammars by means of probabilistic finite automata.
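The decomposition behind this claim is the standard identity D(p || q) = H(p, q) - H(p), where H(p, q) = -sum_w p(w) log q(w) is the cross-entropy between the grammar's distribution p and the automaton's distribution q, and H(p) is the entropy of p; the closed form concerns the first term. As a minimal, hedged sketch of that decomposition (a toy illustration under assumptions of my own, not the paper's general algorithm), the Python example below pairs a hypothetical one-rule PCFG with a hypothetical one-state deterministic PFA, where an expected-rule-count argument yields both terms exactly:

```python
import math

# Toy PCFG p (hypothetical):  S -> a S  (prob 0.4) | b  (prob 0.6),
# which generates the strings a^n b with p(a^n b) = 0.4^n * 0.6.
p_loop, p_stop = 0.4, 0.6

# Toy deterministic PFA q (hypothetical): one state, self-loop on 'a'
# with prob 0.5, read 'b' and accept with prob 0.5, so q(a^n b) = 0.5^(n+1).
q_loop, q_stop = 0.5, 0.5

# Closed-form route via expected rule counts under p.
# The number of uses of S -> a S is geometric, so its expectation is
# p_loop / p_stop; the rule S -> b is used exactly once per derivation.
exp_loop = p_loop / p_stop   # = 2/3
exp_stop = 1.0

# Cross-entropy H(p, q) = -sum_w p(w) log q(w)
#   = exp_loop * (-log q_loop) + exp_stop * (-log q_stop)
cross_entropy = -(exp_loop * math.log(q_loop) + exp_stop * math.log(q_stop))

# Entropy H(p) by the same expected-count argument, with q replaced by p.
entropy = -(exp_loop * math.log(p_loop) + exp_stop * math.log(p_stop))

kl_closed = cross_entropy - entropy

# Sanity check: truncated direct summation over the support {a^n b}.
kl_direct = sum(
    (p_loop**n * p_stop) * math.log((p_loop**n * p_stop) / (0.5 ** (n + 1)))
    for n in range(200)
)

print(f"H(p,q) = {cross_entropy:.6f}")
print(f"H(p)   = {entropy:.6f}")
print(f"KL closed form = {kl_closed:.6f}, direct sum = {kl_direct:.6f}")
```

The direct sum agrees with the closed form to the printed precision; in the general PCFG/PFA setting the expected counts are obtained by solving a linear system over the grammar's nonterminals rather than read off a geometric distribution, which is where the analytical machinery of the paper comes in.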