abstract: Information divergence functions, such as the Kullback-Leibler divergence or the Hellinger distance, play a critical role in statistical signal processing and information theory; however, estimating them can be challenging. Most often, parametric assumptions are made about the two distributions in order to estimate the divergence of interest. In cases where no parametric model fits the data, non-parametric density estimation is used instead. In statistical signal processing applications, Gaussianity is usually assumed, since closed-form expressions for common divergence measures have been derived for this family of distributions. Parametric assumptions are preferred when it is known that the data follow the model; however, this is rarely the case in r...
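For illustration (not part of the original abstract), the closed-form expression referred to above takes, for the KL divergence between two multivariate Gaussians N(mu0, Sigma0) and N(mu1, Sigma1) of dimension k, the form 0.5 * [tr(Sigma1^-1 Sigma0) + (mu1 - mu0)^T Sigma1^-1 (mu1 - mu0) - k + ln(det Sigma1 / det Sigma0)]. The short Python sketch below evaluates this formula and checks it against a simple Monte Carlo estimate; all variable names and test values are illustrative assumptions, not taken from the original work.

import numpy as np
from scipy.stats import multivariate_normal

def kl_gaussian(mu0, cov0, mu1, cov1):
    # Closed-form KL( N(mu0, cov0) || N(mu1, cov1) ) in nats.
    k = len(mu0)
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    trace_term = np.trace(cov1_inv @ cov0)
    quad_term = diff @ cov1_inv @ diff
    logdet_term = np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    return 0.5 * (trace_term + quad_term - k + logdet_term)

rng = np.random.default_rng(0)
mu0, cov0 = np.zeros(2), np.eye(2)
mu1, cov1 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])

# Monte Carlo sanity check: KL(p || q) = E_p[ log p(X) - log q(X) ].
x = rng.multivariate_normal(mu0, cov0, size=200_000)
mc_estimate = np.mean(multivariate_normal.logpdf(x, mu0, cov0)
                      - multivariate_normal.logpdf(x, mu1, cov1))

print(kl_gaussian(mu0, cov0, mu1, cov1), mc_estimate)  # the two values should agree closely

When the Gaussian assumption does not hold, the same Monte Carlo identity can be paired with non-parametric density estimates of p and q, which is the plug-in style of estimator the abstract contrasts with the parametric approach.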
The Kullback-Leibler (KL) divergence is a fundamental measure of information ...
Abstract: A symmetric measure of information divergence is proposed. This measure belongs to the class...
The idea of using functionals of Information Theory, such as entropies or divergences, in statistica...
Information-theoretic measures such as Shannon entropy, mutual information, and the Kullback-Leibler...
Recent work has focused on the problem of nonparametric estimation of information divergence functio...
This thesis documents three different contributions in statistical learning theory. They were develo...
This note provides a bibliography of investigations based on or related to divergence measures for t...
This book presents new and original research in Statistical Information Theory, based on minimum div...
Data science, information theory, probability theory, statistical learning and other related discipl...
Information and Divergence measures deals with the study of problems concerning information processi...
Several authors have developed characterization theorems for the directed divergence or information ...
abstract: Divergence functions are both highly useful and fundamental to many areas in information t...