In this communication, we characterize a measure of information of type (α, β, γ) by postulating axioms parallel to those considered earlier by Havrda and Charvát, together with the recursive relation (1.7). Some properties of this measure are also studied. The measure includes the Shannon information measure as a special case.
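For orientation, the family characterized here generalizes the type-α (structural) entropy of Havrda and Charvát; the exact (α, β, γ) expression is fixed by the recursive relation (1.7) and is not reproduced here. As a reference point, the Havrda–Charvát entropy of type α (α > 0, α ≠ 1) and its Shannon limit are

\[
H_\alpha(p_1,\dots,p_n) = \frac{1}{2^{1-\alpha}-1}\left(\sum_{i=1}^{n} p_i^{\alpha} - 1\right),
\qquad
\lim_{\alpha \to 1} H_\alpha(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i .
\]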
The Shannon entropy based on the probability density function is a key information measure with appl...
The quantitative-qualitative measure of information as given by Belis and Guiaşu is additive, the ad...
To extend the classical Shannon entropy to non-additive measures, Marichal recently int...
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized informat...
This paper concerns an axiomatic characterization of information measures of dimension k. Th...
The concept of information functions of type β (β > 0) is introduced and discussed. By means of thes...
Kullback-Leibler relative-entropy or KL-entropy of P with respect to R, defined as $\int_X \ln \frac{dP}{dR}\,dP$, where...
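When P and R are discrete with mass functions $(p_i)$ and $(r_i)$ (and P is absolutely continuous with respect to R), this Radon–Nikodym form specializes to the familiar sum

\[
D(P \,\|\, R) = \sum_{i} p_i \ln \frac{p_i}{r_i} .
\]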
We present a relation between Tsallis’s entropy and generalized Kerridge inaccuracy which is called ...
Entropy of type $(\alpha, \beta)$ is characterized in this paper by an axiomatic approach. I...
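The exact form characterized in that paper is not quoted in the snippet above; in the literature (e.g., Sharma and Taneja), entropy of type $(\alpha, \beta)$ is commonly written as

\[
H_{\alpha}^{\beta}(p_1,\dots,p_n) = \frac{\sum_{i=1}^{n} p_i^{\alpha} - \sum_{i=1}^{n} p_i^{\beta}}{2^{1-\alpha} - 2^{1-\beta}},
\qquad \alpha \neq \beta ,
\]

which reduces to the type-α entropy at β = 1 and to the Shannon entropy as α, β → 1.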
In [2], [3] and [6], the information of a random variable ξ with respect to another random variable ...
Information Theory is studied from the following view points: (1) the theory of entropy as amount of...
A relation between Shannon entropy and Kerridge inaccuracy, which is known as Shannon inequality, is...
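In its standard form, the Shannon inequality bounds Kerridge's inaccuracy from below by the Shannon entropy: for distributions $(p_i)$ and $(q_i)$,

\[
-\sum_{i=1}^{n} p_i \log q_i \;\ge\; -\sum_{i=1}^{n} p_i \log p_i ,
\]

with equality if and only if $p_i = q_i$ for all $i$.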
We live in the information age. Claude Shannon, as the father of the information age, gave us a theo...
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of informati...