We study metrics on the n-dimensional unit hypercube. We introduce a class of new metrics for this space that is information-theoretically motivated and closely related to the Jensen-Shannon divergence. These metrics arise from a family of functions FDα(P,Q) with parameter α, and we prove that 0 < α ≤ 1/2 is a necessary and sufficient condition for FDα to be a metric. Finally, by computing basic examples on codons, we give a numerical comparison of the new metrics with a previously proposed metric.
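The abstract does not reproduce the definition of FDα, so as a point of reference the sketch below computes the classical Jensen-Shannon divergence and its square root; the square root of the JSD is known to satisfy the triangle inequality (Endres and Schindelin, 2003), which is the kind of metric property the FDα family generalizes. The function names `js_divergence` and `js_metric` are illustrative and are not taken from the paper.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in bits, with the convention 0 log 0 = 0.

    Only called here with q = (p + r)/2, so q > 0 wherever p > 0."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: JSD(p, q) = (D(p||m) + D(q||m)) / 2, m = (p + q)/2."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def js_metric(p, q):
    """sqrt(JSD) is itself a metric on probability distributions."""
    return np.sqrt(js_divergence(p, q))

# Sanity check of the triangle inequality on random 4-point distributions
# (e.g., distributions over the four DNA bases underlying codon comparisons).
rng = np.random.default_rng(0)
for _ in range(1000):
    p, q, r = (rng.dirichlet(np.ones(4)) for _ in range(3))
    assert js_metric(p, r) <= js_metric(p, q) + js_metric(q, r) + 1e-12
```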