We provide a new result that links two crucial entropy notions: Shannon entropy $H_1$ and collision entropy $H_2$. Our formula gives the worst possible amount of collision entropy in a probability distribution when its Shannon entropy is fixed. Our results, and the techniques used in the proof, immediately imply many quantitatively tight separations between Shannon and smooth Rényi entropy, which were previously known only as qualitative statements or one-sided bounds. In particular, we precisely calculate the number of bits that can be extracted from a Shannon entropy source, and calculate how far from the uniform distribution a distribution with a given amount of Shannon entropy can be. To illustrate our results we provide clear numerical examples. In the...
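As a quick numerical illustration of the two quantities this abstract compares (a minimal sketch of the textbook definitions, not the paper's own bounds; the distribution `p` is an arbitrary example):

```python
import math

def shannon_entropy(p):
    # H_1(p) = -sum_i p_i log2 p_i; zero-probability terms contribute 0
    return -sum(x * math.log2(x) for x in p if x > 0)

def collision_entropy(p):
    # H_2(p) = -log2 sum_i p_i^2 (the Renyi entropy of order 2)
    return -math.log2(sum(x * x for x in p))

p = [0.7, 0.1, 0.1, 0.1]
print(shannon_entropy(p))    # ~1.357 bits
print(collision_entropy(p))  # ~0.943 bits; H_2(p) <= H_1(p) for every p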
It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p...
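For context, the baseline against which such estimators are measured is the naive plug-in estimate (a sketch of the standard baseline only, not the improved estimator this abstract refers to):

```python
from collections import Counter
import math

def plugin_entropy(samples):
    # Naive plug-in estimator: empirical frequencies substituted into H(p).
    # Known to be biased downward when the sample size is small relative to k.
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

print(plugin_entropy("abracadabra"))  # entropy of the empirical letter frequencies
```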
As entropy is also an important quantity in physics, we relate our results to physical processes by ...
The notion of smooth entropy allows a unifying, generalized formulation of privacy amplification and...
In many areas of computer science, it is of primary importance to assess the r...
We revisit the problem of estimating entropy of discrete distributions from independent samples, stu...
Suppose that an infinite sequence is produced by independent trials of a random variable with a fixe...
Shannon entropy of a probability distribution gives a weighted mean of a measure of information that...
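The "weighted mean" reading referred to here is the standard identity expressing Shannon entropy as the expectation of self-information:

$$ H(p) \;=\; \sum_i p_i \log \frac{1}{p_i} \;=\; \mathbb{E}_{i \sim p}\!\left[-\log p_i\right]. $$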
We study convexity properties of the Rényi entropy as function of $\alpha >0$ on finite alphabets. W...
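The object under study is the Rényi entropy of order $\alpha$, which recovers Shannon entropy in the limit $\alpha \to 1$:

$$ H_\alpha(p) \;=\; \frac{1}{1-\alpha} \log \sum_i p_i^{\alpha}, \qquad \lim_{\alpha \to 1} H_\alpha(p) = H(p). $$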
We study the two party problem of randomly selecting a common string among all the strings of length...
Even if a probability distribution is properly normalizable, its associated Shannon (or von Neumann)...
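A standard example of this phenomenon (an illustration of the effect, not necessarily the construction used in the cited work): the distribution $p_n \propto 1/(n \log^2 n)$ on $n \ge 2$ is normalizable, yet its entropy series diverges, since $p_n \log(1/p_n)$ behaves like $1/(n \log n)$:

$$ \sum_{n \ge 2} \frac{1}{n \log^2 n} < \infty, \qquad \sum_{n \ge 2} p_n \log \frac{1}{p_n} \;\sim\; \sum_{n \ge 2} \frac{C}{n \log n} = \infty. $$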