Entropy-based measures have been used frequently in symbolic sequence analysis. A symmetrized and smoothed form of the Kullback-Leibler divergence, or relative entropy, the Jensen-Shannon divergence (JSD), is of particular interest because it shares properties with families of other divergence measures and is interpretable in several domains, including statistical physics, information theory, and mathematical statistics. The uniqueness and versatility of this measure arise from a number of attributes, including its generalization to any number of probability distributions and the association of weights with those distributions. Furthermore, its entropic formulation allows its generalization in different statistical frameworks, such as non-ext...
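The generalization described above, the JSD of any number of distributions with associated weights, can be sketched directly from its entropic formulation: JS(P_1, ..., P_n) = H(sum_i w_i P_i) - sum_i w_i H(P_i), where H is the Shannon entropy. The snippet below is a minimal illustration of that formula, not code from any of the cited works; the function names are my own.

```python
import math

def shannon_entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def jensen_shannon_divergence(dists, weights=None):
    """Generalized JSD of n distributions with optional weights.

    Computes H(mixture) - weighted mean of the individual entropies,
    where the mixture is the weighted average of the distributions.
    """
    n = len(dists)
    if weights is None:
        weights = [1.0 / n] * n  # default: equal weights
    mixture = [sum(w * d[i] for w, d in zip(weights, dists))
               for i in range(len(dists[0]))]
    return shannon_entropy(mixture) - sum(
        w * shannon_entropy(d) for w, d in zip(weights, dists))

# Two disjoint binary distributions differ maximally: JSD = 1 bit.
print(jensen_shannon_divergence([[1.0, 0.0], [0.0, 1.0]]))  # → 1.0
```

For two distributions with equal weights this reduces to the familiar symmetrized, smoothed form of the Kullback-Leibler divergence; the divergence is zero exactly when all distributions coincide.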
Quantifying the similarity between symbolic sequences is a traditional problem in information theory...
A comprehensive data base is analyzed to determine the Shannon information content of a protein sequ...
We provide a unifying axiomatics for Rényi's entropy and the non-extensive entropy of Tsallis. It is sh...
We study statistical properties of the Jensen-Shannon divergence D, which quantifies the difference ...
Based on rescaling by some suitable sequence instead of the number of time uni...
We introduce Markov models for segmentation of symbolic sequences, extending a segmentation procedur...
This paper studies the genetic sequence of six species and describes a formulation of quantifying sc...
A new approach to estimate the Shannon entropy of a long-range correlated sequence is proposed. The ...
Biological sequences from different species are called orthologs if they evolved from a s...
The idea of using functionals of Information Theory, such as entropies or divergences, in statistica...
The diffusion entropy analysis measures the scaling of the probability density function (pdf) of the...