Consider the problem of estimating the Shannon entropy of a distribution over k elements from n independent samples. We obtain the minimax mean-square error within universal multiplicative constant factors whenever n exceeds a constant factor of k/log k; otherwise no consistent estimator exists. This refines the recent result of Valiant and Valiant (2011) that the minimal sample size for consistent entropy estimation scales as k/log k. The apparatus of best polynomial approximation plays a key role in both the construction of optimal estimators and, via a duality argument, the minimax lower bound.
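The abstract singles out best polynomial approximation of x ↦ −x log x as the key apparatus. As a hedged illustration only (Chebyshev interpolation yields a near-best approximation, within a logarithmic factor of the true minimax polynomial, and this is not the paper's construction), the following Python sketch shows how quickly the sup-norm error of a degree-d polynomial approximation of φ(x) = −x log x on [0, 1] decays:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def phi(t):
    """phi(x) = -x log x on [0, 1], pulled back to the Chebyshev interval [-1, 1]."""
    x = (t + 1.0) / 2.0
    return np.where(x > 0, -x * np.log(np.maximum(x, 1e-300)), 0.0)

# Interpolation at Chebyshev nodes is within a log factor of the best
# (minimax) polynomial approximation; its sup error decays polynomially
# in the degree, the approximation-theoretic rate the optimal estimators exploit.
for deg in (4, 8, 16, 32, 64):
    coef = C.chebinterpolate(phi, deg)          # Chebyshev-basis coefficients
    t = np.linspace(-1.0, 1.0, 100_001)
    sup_err = np.max(np.abs(C.chebval(t, coef) - phi(t)))
    print(f"degree {deg:3d}: sup error {sup_err:.2e}")
```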
We describe an algorithm to efficiently compute maximum entropy densities, i.e. densities maximizing...
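This snippet is truncated, but computing a maximum-entropy density subject to moment constraints is a standard convex problem: the maximizer has exponential-family form, and the Lagrange multipliers are found by matching moments. A minimal sketch of that standard approach (not the paper's algorithm; the target moments are illustrative values):

```python
import numpy as np
from scipy.optimize import root

# Gauss-Legendre quadrature on [0, 1] for the moment integrals.
t, w = np.polynomial.legendre.leggauss(64)
x, w = 0.5 * (t + 1.0), 0.5 * w

target = np.array([1.0, 0.4, 0.2])   # prescribed mass, E[X], E[X^2] (illustrative)

def moment_residual(lam):
    """Moments of p(x) = exp(lam0 + lam1*x + lam2*x^2) minus the targets."""
    p = np.exp(lam[0] + lam[1] * x + lam[2] * x ** 2)
    return np.array([w @ p, w @ (x * p), w @ (x ** 2 * p)]) - target

sol = root(moment_residual, x0=np.zeros(3))    # solve for the Lagrange multipliers
p = np.exp(sol.x[0] + sol.x[1] * x + sol.x[2] * x ** 2)
print("converged:", sol.success, "moments:", w @ p, w @ (x * p), w @ (x ** 2 * p))
```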
Many statistical procedures, including goodness-of-fit tests and methods for independent component a...
As entropy is also an important quantity in physics, we relate our results to physical processes by ...
It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p...
We revisit the problem of estimating entropy of discrete distributions from independent samples, stu...
A new nonparametric estimator of Shannon’s entropy on a countable alphabet is proposed and analyzed ...
This work addresses the problem of Shannon entropy estimation in countably infinite alphabets studyi...
Recent advances in genetics, computer vision, and text mining are accompanied by analyzing data comi...
Calculating the Shannon entropy for symbolic sequences has been widely considered in many fields. Fo...
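For symbolic sequences, the usual empirical route is block (m-gram) entropy, with the entropy rate estimated from H_m/m or H_m − H_{m−1} as m grows. A minimal sketch of that standard recipe (the sequence below is an arbitrary example; long blocks are undersampled, which is precisely the bias problem the estimators surveyed here target):

```python
from collections import Counter
import math

def block_entropy(seq, m):
    """Empirical Shannon entropy (bits) of the length-m blocks of a symbolic sequence."""
    blocks = [seq[i:i + m] for i in range(len(seq) - m + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

seq = "ABABABABAABBABAB" * 50
for m in (1, 2, 3, 4):
    h = block_entropy(seq, m)
    print(f"m={m}: H_m={h:.3f} bits, H_m/m={h / m:.3f}")
```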
We propose a general framework for the construction and analysis of minimax estimators for a wide cl...
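To make the sample-complexity phenomenon running through these abstracts concrete, here is a minimal baseline sketch (not any of the proposed estimators): when n is far below the alphabet size k, the plug-in estimator is badly biased, and even the classical Miller-Madow first-order correction only partially helps. Closing this gap down to n on the order of k/log k is what the polynomial-approximation estimators achieve.

```python
import numpy as np

def plugin_entropy(counts):
    """Plug-in (maximum likelihood) entropy estimate, in nats."""
    n = counts.sum()
    p = counts[counts > 0] / n
    return -np.sum(p * np.log(p))

def miller_madow(counts):
    """Plug-in estimate plus the classical first-order bias correction."""
    n = counts.sum()
    s = np.count_nonzero(counts)       # number of distinct observed symbols
    return plugin_entropy(counts) + (s - 1) / (2 * n)

rng = np.random.default_rng(0)
k, n = 10_000, 2_000                   # n << k: the regime where naive estimators fail
counts = rng.multinomial(n, np.ones(k) / k)   # uniform source, true H = log k
print("true H      :", np.log(k))
print("plug-in     :", plugin_entropy(counts))
print("Miller-Madow:", miller_madow(counts))
```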