Entropy is a key measure in information theory and its many applications. Campbell first recognized that the exponential of Shannon entropy equals the size of the sample space when the distribution is uniform. This motivates studying the exponentials of Shannon entropy, and of other entropy generalizations involving the logarithmic function, for a general probability distribution. In this paper, we introduce a measure of a sample space, called the 'entropic measure of a sample space', with respect to the underlying distribution. It is shown in both the discrete and continuous cases that this new measure depends on the parameters of the distribution on the sample space - the same sample space having different 'entropic measu...
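Campbell's observation can be checked in a few lines. The sketch below (function names are my own, not from the paper) computes the exponential of Shannon entropy, exp(H): for a uniform distribution over n outcomes it recovers n exactly, while for a skewed distribution over the same outcomes it yields a smaller "effective" sample size.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in nats) of a discrete distribution given as a list of probabilities."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def entropic_size(probs):
    """exp(H): Campbell's effective size of the sample space under the distribution."""
    return math.exp(shannon_entropy(probs))

# Uniform distribution over n = 8 outcomes: exp(H) = 8, the true sample-space size.
uniform = [1 / 8] * 8

# Skewed distribution over the same 8 outcomes: exp(H) falls below 8,
# reflecting the reduced effective number of outcomes.
skewed = [0.5, 0.2, 0.1, 0.05, 0.05, 0.04, 0.03, 0.03]
```

For the uniform case, H = ln n, so exp(H) = n; any departure from uniformity lowers H and hence the entropic size.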
In the field of non-extensive statistical mechanics it is common to focus more attention on the fami...
This PhD report deals with the estimation of both Shannon entropy of distributions from independent ...
Several entropies generalize the Shannon entropy and have it as their li...
Here we assume a discrete random variable, possessing a one-to-one correspondence with the set of na...
The product probability property, known in the literature as statistical independence, is examined first...
Summary: Entropy of type $(\alpha, \beta)$ is characterized in this paper by an axiomatic approach. I...
Abstract: In this paper, we study Shannon's entropy and its applications in the regular exponentia...
The demands for machine learning and knowledge extraction methods have been booming due to the unpre...
This article provides a completion to theories of information based on entropy, resolving a longstan...
A number of “normalized” measures of entropy have been obtained to measure the “intrinsic” uncertain...
Entropies and entropy-like quantities play an increasing role in modern non-linear data analysis. Fi...
Entropies and entropy-like quantities are playing an increasing role in modern non-linear data analy...
By taking into account a geometrical interpretation of the measurement process [1, 2], we define a s...
Shannon entropy of a probability measure $P$, defined as $-\int_X (dP/d\mu)\,\ln(dP/d\mu)\,d...
The estimation of the entropy of a random system or process is of interest in many scientific applic...