We introduce a metric for probability distributions, which is bounded, information-theoretically motivated, and has a natural Bayesian interpretation. The square root of the well-known $\chi^2$ distance is an asymptotic approximation to it. Moreover, it is a close relative of the capacitory discrimination and Jensen-Shannon divergence.
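The abstract above does not reproduce the formula, but the metric it describes is closely tied to the Jensen-Shannon divergence. A minimal sketch, assuming the metric is the square root of twice the Jensen-Shannon divergence in nats (i.e. $D^2 = \mathrm{KL}(P\|M) + \mathrm{KL}(Q\|M)$ with $M = (P+Q)/2$), which is symmetric, bounded by $\sqrt{2\ln 2}$, and zero iff $P = Q$:

```python
import numpy as np

def js_metric(p, q):
    """Sketch of a bounded metric on probability distributions,
    assumed here to be the square root of twice the Jensen-Shannon
    divergence (natural log):
        D(P, Q)^2 = KL(P || M) + KL(Q || M),  M = (P + Q) / 2.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        # Kullback-Leibler divergence; terms with a_i == 0 contribute 0
        # by the convention 0 * log 0 = 0. Where a_i > 0, m_i >= a_i / 2 > 0,
        # so the ratio is always well defined.
        mask = a > 0
        return float(np.sum(a[mask] * np.log(a[mask] / b[mask])))

    return float(np.sqrt(kl(p, m) + kl(q, m)))
```

For distributions with disjoint supports the value attains the bound $\sqrt{2\ln 2} \approx 1.177$; identical distributions give 0.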
A functional defined by means of entropy is considered. It is shown that it is a distance in the set...
In the field of information theory, statistics and other application areas, the information-theoreti...
The local sensitivity analysis is recognized for its computational simplicity, and potential use in ...
The paper is devoted to metrization of probability spaces through the introduction of a quadratic di...
Standard properties of $\phi$-divergences of probability measures are widely applied in vari...
We introduce two new information theoretic measures of distances among probability distributions and...
Burbea and Rao (1982a, 1982b) gave some general methods for constructing quadratic different...
In this paper we discuss the construction of differential metrics in probability spaces through entr...
An acknowledged interpretation of possibility distributions in quantitative possibility theory is in...
Divergence measures are widely used in various applications of pattern recognition, signal processin...
With increasing use of digital control it is natural to view control inputs and outputs a...
Inferring and comparing complex, multivariable probability density functions is fundamental to probl...