A relation between Shannon entropy and Kerridge inaccuracy, known as the Shannon inequality, is well established in information theory. In this communication, we first generalize the Shannon inequality, then give its application in coding theory and discuss some particular cases.
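For orientation, the Shannon inequality referred to in this abstract can be stated in its standard form; the notation $H(P;Q)$ for Kerridge's inaccuracy is assumed here, not drawn from the truncated entries below:

```latex
% Shannon inequality: entropy is bounded above by Kerridge inaccuracy.
% P = (p_1, ..., p_n) is the true distribution, Q = (q_1, ..., q_n) the
% distribution asserted by the experimenter (all q_i > 0, sums equal 1).
\[
  H(P) \;=\; -\sum_{i=1}^{n} p_i \log p_i
  \;\le\;
  -\sum_{i=1}^{n} p_i \log q_i \;=\; H(P;Q),
\]
% with equality if and only if p_i = q_i for every i. Equivalently, the
% Kullback--Leibler divergence D(P \| Q) = H(P;Q) - H(P) is nonnegative.
```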
A new measure L(α), called average code length of order α, has been defined and its relationship wit...
Shannon's theory is commonly addressed in the narrow setting of the general communication scheme: ...
We live in the information age. Claude Shannon, as the father of the information age, gave us a theo...
A relation between Shannon entropy and Kerridge inaccuracy, which is known as Shannon inequality, is...
We present a relation between Tsallis’s entropy and generalized Kerridge inaccuracy which is called ...
We introduce a quantity called the Rényi-Tsallis entropy of order ξ and discuss some of it...
The concept of information theory originated when an attempt was made to create a theoretical model ...
Tsallis relative operator entropy is defined and then its properties are given. Shannon ineq...
We show that an information-theoretic property of Shannon's entropy power, known as concavity of ent...
We provide a simple physical interpretation, in the context of the second law of thermodyna...
What is Shannon’s information theory (IT)? Despite its continued impact on our...
A simple proof for the Shannon coding theorem, using only the Markov inequality, is present...
This book is an updated version of the information theory classic, first published in 1990. About on...
Entropy, conditional entropy and mutual information for discrete-valued random variables pla...
Information: self-information, Shannon’s entropy, joint and conditional entropies, mutual informatio...