Suppose P is an arbitrary discrete distribution on a countable alphabet 𝒳. Given an i.i.d. sample (X1, . . . , Xn) drawn from P, we consider the problem of estimating the entropy H(P) or some other functional F = F(P) of the unknown distribution P. We show that, for additive functionals satisfying mild conditions (including the cases of the mean, the entropy, and mutual information), the plug-in estimates of F are universally consistent. We also prove that, without further assumptions, no rate-of-convergence results can be obtained for any sequence of estimators. In the case of entropy estimation, under a variety of different assumptions, we get rate-of-convergence results for the plug-in estimate and for a nonparametric estimator b...
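The plug-in estimate discussed in this abstract substitutes the empirical distribution for the unknown P and evaluates the functional on it. A minimal sketch for the entropy case (function name and alphabet are illustrative, not from any of the papers listed here):

```python
import math
from collections import Counter

def plugin_entropy(sample):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy, in nats.

    Replaces the unknown distribution P by the empirical distribution
    P_hat built from the sample, then returns H(P_hat) = -sum p log p.
    """
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A balanced two-symbol sample gives exactly H = log 2 ≈ 0.693 nats.
print(plugin_entropy(["H", "T"] * 5000))
```

The estimator is consistent for any fixed P, but, as the abstract notes, no uniform rate of convergence holds over all countable-alphabet distributions without further assumptions.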
It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p...
This paper introduces a class of k-nearest neighbor (k-NN) estimators called bi-partite plug-in (BPI...
Entropy and its various generalizations are widely used in mathematical statistics, communication th...
Given an i.i.d. sample (X1, . . . , Xn) drawn from an unknown discrete distribution P on a countably infinit...
This work addresses the problem of Shannon entropy estimation in countably infinite alphabets studyi...
We propose a general framework for the construction and analysis of minimax estimators for a wide cl...
We consider the estimation of the entropy of a discretely-supported time series through a plug-in es...
Entropy-type integral functionals of densities are widely used in mathematical statistics, informati...
Convergence properties of Shannon Entropy are studied. In the differential setting, it is known that...
Suppose a string X1^n = (X1, X2, . . ., Xn) is generated by a stationary memoryless source (Xn)n≥1 w...
We analyse a plug-in estimator for a large class of integral functionals of one or more continuou...
A new nonparametric estimator of Shannon’s entropy on a countable alphabet is proposed and analyzed ...
Abstract — We consider algorithms for prediction, compression and entropy estimation in a universal...