For a generalized random variable X, a measure L¯X of average code length is defined. Using L¯X, some coding theorems for discrete noiseless channels are proved. Some new concepts like strongly decipherable codes, absolutely strongly optimal codes, etc. are also proposed. It is asserted that if the celebrated noiseless coding theorem of Shannon (for complete random variables) is to be extended properly to generalized random variables, then the coding must be done in a strongly decipherable way.
Abstract. We will discuss entropy from the perspective of information theory. 1. Some coding termin...
Ahlswede R, Gemma J. Bounds on algebraic code capacities for noisy channels. II. Information and Con...
The problem of distortionless encoding, when the parameters of the probabilistic model of a source a...
Shannon's capacity formula for memoryless and finite-state noiseless channels is proved in a simple ...
A new measure L(α), called average code length of order α, has been defined and its relationship wit...
This paper considers the problem of efficient coding (in the information theory sense) for finite, d...
An improvement of the Noiseless Coding Theorem for certain probability distributions is given
Abstract. A new measure L_α^β, called average code word length of order α and type β, has been define...
In the present paper the generalized mean codeword length is studied and characterized a new general...
Proofs of some coding theorems, somewhat more general than Shannon's Fundamental Theorem, are g...
Theorems 1 and 8 from Shannon’s 1948 classic paper [1] both deal with the capacity of what Shannon ...
In information theory, Shannon’s Noisy-Channel Coding Theorem states that it is possible to communic...
The proof of a coding theorem for abstract memoryless channels is given for a general constraint on ...
ABSTRACT. Guiasu and Picard [1] introduced the mean length for ’useful’ codes. They called this len...
We present a relation between Tsallis’s entropy and generalized Kerridge inaccuracy which is called ...