Shannon’s entropy is one of the building blocks of information theory and an essential aspect of Machine Learning (ML) methods (e.g., Random Forests). Yet, it is only finitely defined for distributions with fast-decaying tails on a countable alphabet. The unboundedness of Shannon’s entropy over the general class of all distributions on an alphabet prevents its potential utility from being fully realized. To fill the void in the foundation of information theory, Zhang (2020) proposed generalized Shannon’s entropy, which is finitely defined everywhere. The plug-in estimator, adopted in almost all entropy-based ML method packages, is one of the most popular approaches to estimating Shannon’s entropy. The asymptotic distribution for Shannon’s e...
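The plug-in estimator mentioned above substitutes the empirical (maximum-likelihood) frequencies into the entropy formula H(p) = -Σ p_i log p_i. A minimal sketch in Python, assuming i.i.d. samples from a discrete alphabet (the function name `plugin_entropy` is illustrative, not from any cited package):

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) estimate of Shannon entropy, in nats.

    Replaces the true probabilities p_i in H = -sum p_i * log(p_i)
    with empirical frequencies c_i / n. Note this estimator is biased
    downward for small samples, which motivates the corrected
    estimators discussed in the abstracts below.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Example: a balanced two-symbol sample gives log(2) nats.
print(plugin_entropy(["a", "b", "a", "b"]))
```

The downward bias arises because unseen symbols contribute nothing to the empirical sum, so the estimator systematically understates entropy when the alphabet is undersampled.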
It was recently shown that estimating the Shannon entropy H(p) of a discrete k-symbol distribution p...
As entropy is also an important quantity in physics, we relate our results to physical processes by ...
Entropy rate of sequential data-streams naturally quantifies the complexity of the generati...
Shannon's entropy plays a central role in many fields of mathematics. In the first chapter, we prese...
This work addresses the problem of Shannon entropy estimation in countably infinite alphabets studyi...
This work addresses the problem of Shannon entropy estimation in countably infinite alphabets studyi...
A new nonparametric estimator of Shannon’s entropy on a countable alphabet is proposed and analyzed ...
We study entropy rates of random sequences for general entropy functionals in...
We present a new class of estimators of Shannon entropy for severely undersampled discrete distributi...
We consider the estimation of the entropy of a discretely-supported time series through a plug-in es...
This paper introduces a class of k-nearest neighbor (k-NN) estimators called bi-partite plug-in (BPI...
To study the asymptotic properties of entropy estimates, we use a unified expression, called...
Consider the problem of estimating the Shannon entropy of a distribution over k elements from n inde...
Many algorithms of machine learning use an entropy measure as optimization criterion. Among the wide...
The demands for machine learning and knowledge extraction methods have been booming due to the unpre...