It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p requires Θ(k / log k) samples, a number that grows near-linearly in the support size. In many applications H(p) can be replaced by the more general Rényi entropy of order α, H_α(p). We determine the number of samples needed to estimate H_α(p) for all α, showing that α < 1 requires a super-linear, roughly k^{1/α} samples, non-integer α > 1 requires a near-linear k samples, but, perhaps surprisingly, integer α > 1 requires only Θ(k^{1−1/α}) samples. In particular, estimating H_2(p), which arises in security, DNA reconstruction, closeness testing, and other applications, requires only Θ(√k) samples. The estimators achieving these bounds are simpl...
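For the integer-order case highlighted above, the order-2 (collision) entropy H_2(p) = −log Σ_i p_i² can be estimated from the pairwise collision statistic of the sample. The sketch below is a minimal illustration of that idea, not the paper's exact estimator: it uses the standard unbiased estimator of the collision probability, p̂_2 = Σ_i n_i(n_i − 1) / (n(n − 1)), and reports entropy in nats; the function name and the uniform test distribution are illustrative choices.

```python
import math
import random
from collections import Counter

def renyi2_entropy_estimate(samples):
    """Estimate the collision (Renyi order-2) entropy H2(p) = -log sum_i p_i^2.

    Uses the unbiased collision-probability estimator
        p2_hat = sum_i n_i (n_i - 1) / (n (n - 1)),
    where n_i is the count of symbol i among the n samples.
    Returns the estimate in nats; requires at least one collision.
    """
    n = len(samples)
    counts = Counter(samples)
    p2_hat = sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
    return -math.log(p2_hat)

# Usage: for the uniform distribution over k symbols, H2 = log k,
# so the estimate should land near log(100) here.
random.seed(0)
k = 100
samples = [random.randrange(k) for _ in range(5000)]
print(renyi2_entropy_estimate(samples))
```

Note that 5000 samples is far more than the Θ(√k) needed for k = 100; the point of the sketch is only the structure of the collision-based estimator.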
A new nonparametric estimator of Shannon’s entropy on a countable alphabet is proposed and analyzed ...
Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regu...
We describe an algorithm to efficiently compute maximum entropy densities, i.e. densities maximizing...
Calculating the Shannon entropy for symbolic sequences has been widely considered in many fields. Fo...
Consider the problem of estimating the Shannon entropy of a distribution over k elements from n inde...
We revisit the problem of estimating entropy of discrete distributions from independent samples, stu...
We consider the problem of approximating the entropy of a discrete distribution under several models...
This paper studies the complexity of estimating Rényi divergences of discrete distributions: p obser...
Goldreich et al. (CRYPTO 1999) proved that the promise problem for estimating the Shannon entropy of...
We provide a new result that links two crucial entropy notions: Shannon Entropy H1 and collision ent...