It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p requires Θ(k / log k) samples, a number that grows near-linearly in the support size. In many applications, H(p) can be replaced by the more general Rényi entropy of order α, H_α(p). We determine the number of samples needed to estimate H_α(p) for all α, showing that α < 1 requires super-linearly many samples, roughly k^(1/α); non-integer α > 1 requires near-linearly many, roughly k; but, perhaps surprisingly, integer α > 1 requires only Θ(k^(1−1/α)) samples. In particular, estimating H_2(p), which arises in security, DNA reconstruction, closeness testing, and other applications, requires only Θ(√k) samples. The estimators achieving these bounds...
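The Θ(√k) result for H_2(p) can be made concrete with the standard collision-based estimator: since H_2(p) = −log2(Σ_i p_i²), and the pairwise collision rate among i.i.d. samples is an unbiased estimate of the power sum Σ_i p_i², one can estimate H_2 from far fewer samples than the support size. The sketch below illustrates this classical technique; it is an assumption-laden illustration, not necessarily the exact estimator from the abstract above.

```python
import random
from collections import Counter
from math import log2, sqrt

def renyi2_entropy_estimate(samples):
    """Estimate H_2(p) = -log2(sum_i p_i^2) from i.i.d. samples via the
    collision estimator: the fraction of sample pairs that agree is an
    unbiased estimate of the power sum sum_i p_i^2."""
    n = len(samples)
    counts = Counter(samples)
    # Number of colliding (unordered) pairs: sum over symbols of C(n_i, 2).
    collisions = sum(c * (c - 1) // 2 for c in counts.values())
    p2_hat = collisions / (n * (n - 1) / 2)  # unbiased estimate of sum_i p_i^2
    return -log2(p2_hat)

# Demo on the uniform distribution over k symbols, where H_2 = log2(k).
# Note the sample size scales like sqrt(k), as in the bound above.
random.seed(0)
k = 1000
n = int(20 * sqrt(k))
samples = [random.randrange(k) for _ in range(n)]
print(renyi2_entropy_estimate(samples))  # close to log2(1000) ≈ 9.97
```

With roughly 20√k samples the expected number of collisions on a uniform distribution is already in the hundreds, so the power-sum estimate concentrates and the log-scale error is small.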
We describe an algorithm to efficiently compute maximum entropy densities, i.e. densities maximizing...
It is well known that to estimate the Shannon entropy for symbolic sequences accurately requires a l...
Sample Entropy is the most popular definition of entropy and is widely used as a measure of the regu...
Calculating the Shannon entropy for symbolic sequences has been widely considered in many fields. Fo...
Consider the problem of estimating the Shannon entropy of a distribution over k elements from n inde...
We revisit the problem of estimating entropy of discrete distributions from independent samples, stu...
We consider the problem of approximating the entropy of a discrete distribution under several models...
This paper studies the complexity of estimating Rényi divergences of discrete distributions: p obser...
Goldreich et al. (CRYPTO 1999) proved that the promise problem for estimating the Shannon entropy of...
We provide a new result that links two crucial entropy notions: Shannon Entropy H1 and collision ent...
A new nonparametric estimator of Shannon’s entropy on a countable alphabet is proposed and analyzed ...