The aim of this review is to give different two-parametric generalizations of the following measures: directed divergence (Kullback and Leibler, 1951), Jensen difference divergence (Burbea and Rao, 1982a,b; Rao, 1982) and Jeffreys invariant divergence (Jeffreys, 1946). These generalizations are put into a unified expression and their properties are studied. Applications of the generalized information and divergence measures to the comparison of experiments, and their connections with the Fisher information measure, are also given.
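For concreteness, the three base measures being generalized can be written out for the discrete finite-alphabet case. Below is a minimal Python sketch (natural logarithms, strictly positive probability vectors assumed; the function names are illustrative and not taken from the review):

```python
import numpy as np

def directed_divergence(p, q):
    """Kullback-Leibler directed divergence D(P||Q) = sum_i p_i * log(p_i / q_i)."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(p * np.log(p / q)))

def jeffreys_divergence(p, q):
    """Jeffreys invariant divergence J(P, Q) = D(P||Q) + D(Q||P)."""
    return directed_divergence(p, q) + directed_divergence(q, p)

def jensen_difference_divergence(p, q):
    """Burbea-Rao Jensen difference: H((P+Q)/2) - (H(P) + H(Q))/2,
    where H is the Shannon entropy."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    H = lambda r: -float(np.sum(r * np.log(r)))
    return H((p + q) / 2) - (H(p) + H(q)) / 2

# Example: two distributions on a three-letter alphabet.
P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
print(directed_divergence(P, Q))           # non-symmetric: D(P||Q) != D(Q||P)
print(jeffreys_divergence(P, Q))           # symmetric sum of the two directions
print(jensen_difference_divergence(P, Q))  # symmetric and bounded
```

The two-parametric families surveyed in the review are constructed so that these three measures are recovered for particular choices of the parameters.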
Data science, information theory, probability theory, statistical learning, statistical signal proce...
The idea of using functionals of Information Theory, such as entropies or divergences, in statistica...
Several authors have developed characterization theorems for the directed divergence or information ...
Divergence measures are useful for comparing two probability distributions. Depending on th...
Finding the relationships between information measures and statistical constants leads to the applic...
The problem of choosing a proper divergence measure is an important one. We provide new ...
Some new inequalities for the well-known Jeffreys divergence measure in Information Theory are given.
In this paper we introduce the concepts of Stolarsky and Gini divergence measures and establish a n...
Perturbed generalised Taylor-like series are utilised to obtain approximations and bounds for diverg...
The fundamentals of information theory and also their applications to testing statistical hy...
Inequalities which connect information divergence with other measures of discrimination or distance ...
In this paper we shall consider one parametric generalization of some non-symmetric diverg...