Jensen–Shannon divergence is a well-known, multi-purpose measure of dissimilarity between probability distributions. It has been proven that the square root of this quantity is a true metric, in the sense that, in addition to the basic properties of a distance, it also satisfies the triangle inequality. In this work we extend this result and prove that it is in fact possible to derive a monoparametric family of metrics from the classical Jensen–Shannon divergence. Motivated by our results, we explore an application to the segmentation of symbolic sequences. Additionally, we analyze the possibility of extending this result to the quantum realm.
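For concreteness, the quantities involved admit a standard statement (textbook definitions, supplied here rather than quoted from the paper): for probability distributions $P$ and $Q$ with mixture $M = \tfrac{1}{2}(P+Q)$,
\[
\mathrm{JSD}(P,Q) \;=\; H\!\left(\frac{P+Q}{2}\right) - \frac{H(P)+H(Q)}{2} \;=\; \frac{1}{2}\,D_{\mathrm{KL}}(P\,\|\,M) + \frac{1}{2}\,D_{\mathrm{KL}}(Q\,\|\,M),
\]
where $H$ denotes the Shannon entropy and $D_{\mathrm{KL}}$ the Kullback–Leibler divergence. The metric referred to above is $d(P,Q)=\sqrt{\mathrm{JSD}(P,Q)}$, which is non-negative, symmetric, vanishes only for $P=Q$, and satisfies the triangle inequality.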
There are many applications that benefit from computing the exact divergence between two discrete prob...
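As an illustration of such an exact computation, here is a minimal Python sketch (NumPy assumed; the function name jensen_shannon_divergence is ours, not taken from the cited work):

import numpy as np

def jensen_shannon_divergence(p, q, base=2.0):
    # p, q: 1-D arrays of non-negative weights over the same support, each summing to 1.
    # Returns a value in [0, 1] when base=2.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)

    def kl(a, b):
        # By convention 0 * log(0/x) = 0, so restrict to the support of a;
        # a > 0 implies m > 0, so the ratio is always well defined here.
        mask = a > 0
        return np.sum(a[mask] * np.log(a[mask] / b[mask])) / np.log(base)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Example: a biased coin versus a fair coin.
print(jensen_shannon_divergence([0.9, 0.1], [0.5, 0.5]))  # ~0.1468 bits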
Inferring and comparing complex, multivariable probability density functions is fundamental to probl...
In this paper, we generalize the results obtained with the Kullback distance (...
Jensen-Shannon divergence (JD) is a symmetrized and smoothed version of the most important divergenc...
Standard properties of $\phi$-divergences of probability measures are widely applied in vari...
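For readability, the standard definition in question (a textbook formulation, not quoted from the truncated summary): for a convex function $\phi$ with $\phi(1)=0$, the $\phi$-divergence between discrete distributions $P=(p_i)$ and $Q=(q_i)$ is
\[
D_\phi(P\,\|\,Q) \;=\; \sum_i q_i\,\phi\!\left(\frac{p_i}{q_i}\right),
\]
so that $\phi(t)=t\log t$ recovers the Kullback–Leibler divergence and $\phi(t)=(t-1)^2$ the $\chi^2$-divergence.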
In a recent paper, the generalization of the Jensen-Shannon divergence in the context of quantum the...
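The quantum generalization alluded to replaces probability distributions with density operators and Shannon entropy with von Neumann entropy; a standard form (supplied here for context, since the truncated abstract does not spell it out) is
\[
\mathrm{QJSD}(\rho,\sigma) \;=\; S\!\left(\frac{\rho+\sigma}{2}\right) - \frac{S(\rho)+S(\sigma)}{2}, \qquad S(\rho) \;=\; -\,\mathrm{Tr}\,(\rho\log\rho),
\]
for density operators $\rho$ and $\sigma$.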
Divergence measures are widely used in various applications of pattern recognition, signal processin...
The Fisher informational metric is unique in some sense (it is the only Markovian monotone...
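For reference, the metric in question is the textbook Fisher information metric: for a smooth parametric family $p_\theta$,
\[
g_{jk}(\theta) \;=\; \mathbb{E}_{p_\theta}\!\left[\frac{\partial \log p_\theta(X)}{\partial \theta_j}\,\frac{\partial \log p_\theta(X)}{\partial \theta_k}\right],
\]
which by Chentsov's theorem is, up to an overall scale, the only Riemannian metric on statistical models that is monotone under Markov morphisms.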
We describe a framework to build distances by measuring the tightness of inequalities and introduce ...
Multivariate Gaussian densities are pervasive in pattern recognition and machine learning. A central...
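One such central operation is evaluating divergences between Gaussians in closed form. As an illustration, a minimal sketch of the well-known closed-form Kullback–Leibler divergence between two multivariate Gaussians (NumPy assumed; the function name kl_gaussians is ours):

import numpy as np

def kl_gaussians(mu0, cov0, mu1, cov1):
    # KL( N(mu0, cov0) || N(mu1, cov1) ) for k-dimensional Gaussians, in nats.
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    trace_term = np.trace(cov1_inv @ cov0)
    quad_term = diff @ cov1_inv @ diff
    # slogdet is numerically safer than log(det(...)).
    logdet_term = np.linalg.slogdet(cov1)[1] - np.linalg.slogdet(cov0)[1]
    return 0.5 * (trace_term + quad_term - k + logdet_term)

# Example: equal covariances, shifted mean; KL reduces to ||mu1 - mu0||^2 / 2.
print(kl_gaussians(np.zeros(2), np.eye(2), np.ones(2), np.eye(2)))  # 1.0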
Classical information geometry has emerged from the study of the geometrical aspects of the stati...
We propose a simple method of construction of new families of $\phi$-divergences. This meth...
Motivated by the method of interpolating inequalities that makes use of the improved Jensen...
Metrics and distances in probability spaces have shown to be useful tools for physical purposes. Her...