In this paper we introduce the concepts of Stolarsky and Gini divergence measures and establish a number of basic properties. Some comparison results within the same class and between different classes are also given.
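For orientation, the names refer to the classical two-parameter Stolarsky (extended) and Gini means, stated here in standard notation, which may differ from the paper's own: for x, y > 0 with x ≠ y and parameters a ≠ b (with ab ≠ 0 in the Stolarsky case),

E_{a,b}(x,y) = \left( \frac{b\,(x^{a} - y^{a})}{a\,(x^{b} - y^{b})} \right)^{\frac{1}{a-b}}, \qquad G_{a,b}(x,y) = \left( \frac{x^{a} + y^{a}}{x^{b} + y^{b}} \right)^{\frac{1}{a-b}}.

The associated divergence measures are constructed from these means in the paper itself and are not reproduced here.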
A general divergence measure for monotonic functions is introduced. Its connections with the f-divergence ...
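The f-divergence referred to here is Csiszár's functional, stated in standard notation which may differ from that of the cited work: for a convex function f : (0, ∞) → ℝ with f(1) = 0 and discrete probability distributions P = (p_i), Q = (q_i),

D_f(P, Q) = \sum_{i} q_i \, f\!\left( \frac{p_i}{q_i} \right),

which recovers the Kullback–Leibler divergence D_{KL}(P\|Q) = \sum_i p_i \ln (p_i / q_i) for f(t) = t \ln t, the χ²-divergence for f(t) = (t - 1)^2, and the squared Hellinger distance (up to normalisation) for f(t) = (\sqrt{t} - 1)^2.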
In recent years Dragomir has contributed a substantial body of work providing different kinds of bounds on the distance ...
Perturbed generalised Taylor-like series are utilised to obtain approximations and bounds for divergence ...
Data science, information theory, probability theory, statistical learning, statistical signal processing ...
The aim of this review is to give different two-parametric generalizations of the following measures...
The problem of choosing a proper divergence measure is an important one. We provide new ...
In this paper we introduce the concepts of p-logarithmic and α-power divergence measures and point ...
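The names presumably refer to the classical p-logarithmic and α-power means of two positive numbers x ≠ y, whose standard forms are

L_p(x,y) = \left( \frac{y^{p+1} - x^{p+1}}{(p+1)(y - x)} \right)^{1/p} \quad (p \neq -1, 0), \qquad M_\alpha(x,y) = \left( \frac{x^{\alpha} + y^{\alpha}}{2} \right)^{1/\alpha} \quad (\alpha \neq 0);

the corresponding divergence measures are constructed in the paper itself.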
In this work, we introduce a new series of divergence measures as a family of Csiszár's functional divergence ...
Some new inequalities for the well-known Jeffreys divergence measure in Information Theory are given.
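In standard notation, which may differ from that of the cited works, the Jeffreys divergence is the symmetrised Kullback–Leibler divergence:

J(P, Q) = \sum_{i} (p_i - q_i) \ln \frac{p_i}{q_i} = D_{KL}(P\|Q) + D_{KL}(Q\|P).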
Inequalities which connect information divergence with other measures of discrimination or distance ...
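A canonical inequality of this kind, given here for reference rather than as one of the results of the work above, is Pinsker's inequality, which bounds the total variation distance by the Kullback–Leibler divergence:

\frac{1}{2} \Big( \sum_{i} |p_i - q_i| \Big)^{2} \le D_{KL}(P\|Q).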
A symmetric measure of information divergence is proposed. This measure belongs to the class ...
Information and divergence measures deal with the study of problems concerning information processing ...
Inequalities play a fundamental role in Information Theory and Statistics ...