Abstract: In statistical estimation problems, measures between probability distributions play significant roles. The Hellinger coefficient, Jeffreys distance, Chernoff coefficient, directed divergence, and its symmetrization, the J-divergence, are examples of such measures. Here these and similar measures are characterized through a composition law and the sum form they possess. The functional equations $f(pr, qs) + f(ps, qr) = (r + s)\,f(p, q) + (p + q)\,f(r, s)$ and $f(pr, qs) + f(ps, qr) = f(p, q)\,f(r, s)$ are instrumental in their deduction.
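As a quick numerical sketch of these composition laws (the kernels, function names, and random test points below are standard examples chosen here for illustration, not notation taken from the paper): the J-divergence kernel $(p - q)\ln(p/q)$ satisfies the additive equation, while a symmetrized Chernoff-type kernel $p^a q^{1-a} + p^{1-a} q^a$ (twice the Hellinger kernel at $a = 1/2$) satisfies the multiplicative one.

```python
import numpy as np

def j_kernel(p, q):
    # Sum-form kernel of the J-divergence: J(P, Q) = sum_i (p_i - q_i) * ln(p_i / q_i)
    return (p - q) * np.log(p / q)

def chernoff_kernel(p, q, a=0.5):
    # Symmetrized Chernoff-type kernel; at a = 1/2 it equals 2 * sqrt(p * q),
    # i.e. twice the Hellinger kernel.
    return p**a * q**(1 - a) + p**(1 - a) * q**a

rng = np.random.default_rng(0)
p, q, r, s = rng.uniform(0.1, 1.0, size=4)

# Additive composition law: f(pr, qs) + f(ps, qr) = (r + s) f(p, q) + (p + q) f(r, s)
lhs = j_kernel(p * r, q * s) + j_kernel(p * s, q * r)
rhs = (r + s) * j_kernel(p, q) + (p + q) * j_kernel(r, s)
print(np.isclose(lhs, rhs))  # True

# Multiplicative composition law: f(pr, qs) + f(ps, qr) = f(p, q) f(r, s)
lhs = chernoff_kernel(p * r, q * s) + chernoff_kernel(p * s, q * r)
rhs = chernoff_kernel(p, q) * chernoff_kernel(r, s)
print(np.isclose(lhs, rhs))  # True
```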
We establish a decomposition of the Jensen-Shannon divergence into a linear combination of a...
We propose a new concept of codivergence, which quantifies the similarity between two probability me...
Standard properties of $\phi$-divergences of probability measures are widely applied in vari...
The present work aims to study the stability of the following three functional equations: (i...
The Jensen-Shannon divergence is a renowned bounded symmetrization of the unbounded Kullback-Leibler d...
Data science, information theory, probability theory, statistical learning and other related discipl...
Divergence measures are widely used in various applications of pattern recognition, signal processin...
The paper is devoted to metrization of probability spaces through the introduction of a quad...
Several authors have developed characterization theorems for the directed divergence or information ...
Data science, information theory, probability theory, statistical learning, statistical signal proce...
The f-divergence evaluates the dissimilarity between two probability distributions defined in terms ...
Given two probability measures P and Q and an event E, we provide bounds on P(E) in terms of Q(E) an...
In this work, we introduce new information inequalities on new generalized f-divergence in terms of ...
For two arbitrary probability measures on real d-space with given means and variances (covariance ma...