Properties of the conditional entropy are studied, and it is shown that Hartley's conditional entropy satisfies only a proper subset of the desirable properties; this justifies the adoption of Shannon's entropy.
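For context, the two quantities being compared here are standard; a minimal sketch of the usual definitions (notation ours; the Hartley conditional form shown is one common averaged variant, not necessarily the one used in the paper):

    H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log p(x \mid y)  \qquad \text{(Shannon)}

    H_0(X \mid Y) = \sum_{y} p(y)\,\log \bigl|\{\, x : p(x \mid y) > 0 \,\}\bigr|  \qquad \text{(Hartley, averaged form)}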
The logical basis for information theory is the newly developed logic of partitions that is dual to ...
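For reference, the logical entropy of a partition \pi with block probabilities p_B developed in this line of work is (assuming the standard definition from the logical-entropy literature):

    h(\pi) = \sum_{B \in \pi} p_B (1 - p_B) = 1 - \sum_{B \in \pi} p_B^2,

i.e. the probability that two independent draws land in distinct blocks, as against Shannon's H(\pi) = -\sum_{B} p_B \log p_B.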
We provide a simple physical interpretation, in the context of the second law of thermodyna...
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of informati...
The Rényi entropy of general order unifies the well-known Shannon entropy with several other entropy...
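The unification referred to here is the one-parameter Rényi family (standard definition; the limiting cases below are well known):

    H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x p(x)^\alpha, \qquad \alpha \ge 0,\ \alpha \ne 1,

with \alpha \to 1 recovering Shannon entropy, \alpha = 0 giving Hartley entropy \log |\{ x : p(x) > 0 \}|, and \alpha \to \infty giving the min-entropy -\log \max_x p(x).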
The authors have characterized axiomatically the Shannon entropy (which is a symmetric function of i...
We live in the information age. Claude Shannon, as the father of the information age, gave us a theo...
In [2], [3] and [6], the information of a random variable ξ with respect to another random variable ...
The routine definitions of Shannon entropy for both discrete and continuous probability laws show in...
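The two routine definitions in question are, for a discrete law p and a continuous law with density f:

    H(X) = -\sum_x p(x) \log p(x), \qquad h(X) = -\int f(x) \log f(x)\, dx.

A standard caveat (one plausible reading of the truncated claim): h is not the limit of H under discretization, since quantizing to bins of width \Delta gives H(X_\Delta) \approx h(X) - \log \Delta, which diverges as \Delta \to 0.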
We start with a clear distinction between Shannon’s Measure of Information (SMI) and the Thermodynam...
There is no generally accepted definition for conditional Tsallis entropy. The standard definition o...
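For orientation, the (unconditional) Tsallis entropy of order q is

    S_q(X) = \frac{1}{q-1} \Bigl( 1 - \sum_x p(x)^q \Bigr), \qquad q \to 1 \text{ recovering Shannon entropy.}

The competing conditional versions alluded to include the averaged form \sum_y p(y)^q\, S_q(X \mid Y = y) and the form obtained by imposing the pseudo-additivity chain rule, S_q(X \mid Y) = \bigl( S_q(X,Y) - S_q(Y) \bigr) / \bigl( 1 + (1-q) S_q(Y) \bigr); which of these the abstract treats as standard cannot be recovered from the truncated text.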
Shannon (1948) showed that, by a proper choice of the conditional probabilities of the symbols in a ...
Using a definition of conditional entropy given by Hanen and Neveu [5, 10, 11] we discuss in this pa...
We generalize the conditional entropy without probability given by Benvenuti in [1] and we recognize...
Entropy, conditional entropy and mutual information for discrete-valued random variables pla...
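The standard relations among these three quantities, for reference:

    H(X,Y) = H(Y) + H(X \mid Y), \qquad I(X;Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X,Y) \ge 0,

with I(X;Y) = 0 exactly when X and Y are independent.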