There is no generally accepted definition of conditional Tsallis entropy. The standard definition of (unconditional) Tsallis entropy depends on a parameter α and converges to the Shannon entropy as α approaches 1. In this paper, we describe three definitions of conditional Tsallis entropy proposed in the literature, study their properties, and compare their values as a function of α. We also consider another natural candidate for conditional Tsallis entropy and compare it with the existing ones. Lastly, we present an online tool that computes the four conditional Tsallis entropies, given the probability distributions and the value of the parameter α.
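The convergence mentioned above can be illustrated with a minimal sketch. The snippet below uses the standard Tsallis entropy S_α(p) = (1 − Σᵢ pᵢ^α)/(α − 1) and, for α = 1, the Shannon entropy with natural logarithm; the function name and the sample distribution are illustrative choices, not taken from the paper's tool.

```python
import math

def tsallis_entropy(p, alpha):
    """Tsallis entropy S_alpha(p) = (1 - sum_i p_i^alpha) / (alpha - 1).

    At alpha = 1 the definition is extended by its limit,
    the Shannon entropy -sum_i p_i * ln(p_i) (natural log).
    """
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1 - sum(pi ** alpha for pi in p)) / (alpha - 1)

# Example distribution (hypothetical, for illustration only).
p = [0.5, 0.3, 0.2]
shannon = tsallis_entropy(p, 1)

# As alpha approaches 1, the Tsallis entropy approaches the Shannon entropy.
for alpha in (2.0, 1.5, 1.1, 1.01, 1.001):
    print(f"alpha = {alpha}: S_alpha = {tsallis_entropy(p, alpha):.6f}  "
          f"(Shannon = {shannon:.6f})")
```

Running the loop with α values closer and closer to 1 shows S_α(p) tending to the Shannon value, which is the limiting behaviour the abstract refers to.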