Quantifying the similarity between symbolic sequences is a traditional problem in information theory which requires comparing the frequencies of symbols in different sequences. In numerous modern applications, ranging from DNA to music to texts, the distribution of symbol frequencies is characterized by heavy-tailed distributions (e.g., Zipf's law). The large number of low-frequency symbols in these distributions poses major difficulties to the estimation of the similarity between sequences; e.g., they hinder an accurate finite-size estimation of entropies. Here, we show analytically how the systematic (bias) and statistical (fluctuations) errors in these estimations depend on the sample size N and on the exponent of the heavy-tailed distribution...
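The finite-size bias described in this abstract can be illustrated numerically. The sketch below (my own illustration, not code from the paper) samples from a Zipf distribution and compares the maximum-likelihood ("plug-in") entropy estimate against the entropy of the true distribution; the systematic underestimation is large for small N and shrinks slowly as N grows.

```python
import numpy as np

rng = np.random.default_rng(0)

def zipf_probs(k, alpha=1.0):
    """Normalized Zipf distribution over k symbols: p_r proportional to r**-alpha."""
    w = np.arange(1, k + 1, dtype=float) ** -alpha
    return w / w.sum()

def plugin_entropy(sample):
    """Maximum-likelihood ('plug-in') Shannon entropy (nats) of an observed sample."""
    _, counts = np.unique(sample, return_counts=True)
    p = counts / counts.sum()
    return -(p * np.log(p)).sum()

k = 10_000                       # vocabulary size
p = zipf_probs(k, alpha=1.0)
true_h = -(p * np.log(p)).sum()  # entropy of the true distribution

for n in (100, 1_000, 10_000, 100_000):
    est = plugin_entropy(rng.choice(k, size=n, p=p))
    print(f"N={n:>7}  plug-in H={est:.3f}  true H={true_h:.3f}  bias={est - true_h:+.3f}")
```

The bias is always negative here: with a heavy-tailed distribution, most low-frequency symbols are simply never observed at small N, so the empirical distribution looks less diverse than the true one.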
Natural language is a remarkable example of a complex dynamical system which combines variation and ...
When an i.i.d. sequence of letters is cut into words according to i.i.d. renewal times, an i.i.d. se...
The choice associated with words is a fundamental property of natural languages. It lies at the hear...
We show how generalized Gibbs-Shannon entropies can provide new insights on the statistical properti...
Recently, it was demonstrated that generalized entropies of order α offer novel and important opport...
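The abstract is truncated, but "generalized entropies of order α" in this literature conventionally refers to the Rényi family (an assumption on my part). A minimal sketch of how the order α tunes the weight given to rare versus frequent symbols:

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (nats); alpha -> 1 recovers Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -(p * np.log(p)).sum()                 # Shannon limit
    return np.log((p ** alpha).sum()) / (1.0 - alpha)

# Zipf-like distribution: small alpha is dominated by the heavy tail of rare
# symbols, large alpha by the few most frequent ones.
ranks = np.arange(1, 1001, dtype=float)
p = ranks ** -1.0
p /= p.sum()
for a in (0.5, 1.0, 2.0):
    print(f"alpha={a}: H={renyi_entropy(p, a):.3f}")
```

Because large-α entropies depend mostly on the well-sampled frequent symbols, they can be estimated reliably from far smaller samples than the Shannon (α = 1) case, which is one reason these generalized entropies are attractive for heavy-tailed data.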
This work is a discussion of algorithms for estimating the Shannon entropy h of finite symbol sequen...
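One standard algorithm in this family (a hedged illustration, not necessarily the one this truncated abstract discusses) is the plug-in estimator combined with the Miller-Madow bias correction, which removes the leading O(1/N) term of the systematic underestimation:

```python
import numpy as np

def entropy_estimates(counts):
    """Plug-in Shannon entropy (nats) and its Miller-Madow correction.

    The plug-in estimator underestimates h for finite samples; adding
    (m - 1) / (2N), with m the number of observed symbol types and N the
    sample size, cancels the leading O(1/N) term of that bias.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h_plugin = -(p * np.log(p)).sum()
    m = int((counts > 0).sum())
    return h_plugin, h_plugin + (m - 1) / (2.0 * n)

# Example: symbol counts over a 4-symbol alphabet.
h_ml, h_mm = entropy_estimates([40, 30, 20, 10])
```

Note that for heavy-tailed (Zipf-like) distributions the number of unobserved symbol types stays large at any realistic N, so even corrected estimators of this kind converge slowly.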
Zipf’s law has intrigued people for a long time. This distribution models a ce...
Entropy-based measures have been frequently used in symbolic sequence analysis. A symmetrized and sm...
A new approach to estimate the Shannon entropy of a long-range correlated sequence is proposed. The ...
As entropy is also an important quantity in physics, we relate our results to physical processes by ...