In this paper we introduce the concepts of p-logarithmic and α-power divergence measures and point out a number of basic results.
Finding the relationships between information measures and statistical constants leads to the applic...
In this paper, we obtain certain bounds for some dynamic information divergence measures, viz. Renyi...
In this paper we introduce the concepts of Stolarsky and Gini divergence measures and establish a n...
Data science, information theory, probability theory, statistical learning, statistical signal proce...
Some new inequalities for the well-known Jeffreys divergence measure in Information Theory are given.
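For context, the Jeffreys divergence referred to above is the classical symmetrized Kullback-Leibler divergence. A minimal sketch (the function name and the example distributions are illustrative, not from the cited work):

```python
import math

def jeffreys_divergence(p, q):
    """Jeffreys divergence J(P, Q) = sum_i (p_i - q_i) * log(p_i / q_i),
    equivalently KL(P||Q) + KL(Q||P); assumes strictly positive entries."""
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

# Illustrative distributions over three outcomes:
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(jeffreys_divergence(p, q))
```

By construction the measure is symmetric in its arguments, non-negative, and zero exactly when the two distributions coincide.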
The problem of choosing a proper divergence measure is an important one. We provide new ...
In this work, we introduce new series of divergence measures as a family of Csiszar’s functional div...
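The Csiszar functional divergence family mentioned above has a standard general form; a minimal sketch of it, with two classical members for illustration (the helper names are hypothetical):

```python
import math

def f_divergence(p, q, f):
    """Csiszar f-divergence D_f(P||Q) = sum_i q_i * f(p_i / q_i),
    where f is convex with f(1) = 0; assumes strictly positive entries."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

# Two classical members of the family:
kl = lambda t: t * math.log(t)    # f for the Kullback-Leibler divergence
tv = lambda t: 0.5 * abs(t - 1)   # f for the total variation distance

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl))
```

Choosing different convex generators f recovers different named divergences, which is what makes the family a convenient umbrella for new series of measures.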
The principle of optimality of dynamic programming is used to prove three major inequalities due to ...
A symmetric measure of information divergence is proposed. This measure belongs to the class...
A general divergence measure for monotonic functions is introduced. Its connections with the f−di...