Mutual information of two random variables can be easily obtained from their Shannon entropies. However, when nonadditive entropies are involved, calculating the mutual information is more complex. Here we review the basics of information obtained from the Shannon entropy; then we analyse the case of the generalized nonadditive Tsallis entropy.
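For orientation, a brief sketch of the standard definitions involved (these formulas are textbook material, not quoted from the works listed below): in the Shannon case the mutual information follows directly from the entropies,

$$ I(X;Y) = H(X) + H(Y) - H(X,Y), $$

whereas the Tsallis entropy $S_q = \big(1 - \sum_i p_i^q\big)/(q-1)$ is nonadditive; for independent $X$ and $Y$ it satisfies the pseudo-additivity rule $S_q(X,Y) = S_q(X) + S_q(Y) + (1-q)\,S_q(X)\,S_q(Y)$, recovering the Shannon entropy in the limit $q \to 1$. This is why a q-generalized mutual information cannot be read off from the entropies as simply as in the additive case.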
Statistical relationships among the variables of a complex system reveal a lot about its physical be...
There is no generally accepted definition for conditional Tsallis entropy. The standard definition o...
In information theory, one major goal is to find useful functions that summarize the amount of infor...
In [1], we have discussed the mutual information of two random variables and how it can be obtained ...
The role of Tsallis's non-extensive Information Measure within an à la Jaynes Information-Theory-base...
Several entropies generalize the Shannon entropy and have it as their li...
After Shannon, entropy became a fundamental quantity describing not only the uncertainty or chaos of ...
Non-extensive statistical mechanics appears as a powerful way to describe complex systems. Tsallis e...
In a previous note (1) we suggested using P(x)P(p|x) where P(x)=W(x)W(x) and P(p|x) = a(p)exp(ipx)/W...
Noise-aided information transmission via stochastic resonance is shown and analyzed in a binary chan...