Let L(t) = t^{-1} log_D (∑ p_i D^{t n_i}), where p_i is the probability of the i-th input symbol to a noiseless channel, and n_i is the length of the code sequence for the i-th symbol in some uniquely decipherable code. Limiting values of L(t) are ∑ n_i p_i for t = 0 and max(n_i) for t = ∞. It is shown that L(t) has some desirable properties as a measure of typical code length. A coding theorem for a noiseless channel is proved. The theorem states roughly that it is possible to encode so that L(t) is close to H_α, where H_α is Rényi's entropy of order α and α = (1 + t)^{-1}.
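As a sketch of the quantities in this abstract (the function names and the example distribution are illustrative, not from the paper), the exponential mean length L(t), its limiting behaviour, and its relation to Rényi's entropy of order α = (1 + t)^{-1} can be checked numerically in Python:

```python
import math

def exp_mean_length(p, n, t, D=2):
    """Exponential mean code length of order t (t > 0):
    L(t) = t^{-1} * log_D( sum_i p_i * D**(t * n_i) )."""
    s = sum(pi * D ** (t * ni) for pi, ni in zip(p, n))
    return math.log(s, D) / t

def renyi_entropy(p, alpha, D=2):
    """Rényi entropy of order alpha (alpha != 1), base D."""
    return math.log(sum(pi ** alpha for pi in p), D) / (1 - alpha)

# Example: a dyadic source with a uniquely decipherable binary code.
p = [0.5, 0.25, 0.125, 0.125]   # symbol probabilities
n = [1, 2, 3, 3]                # code word lengths (Kraft sum = 1)

# As t -> 0, L(t) approaches the ordinary mean length sum n_i p_i = 1.75;
# as t -> infinity, it approaches max(n_i) = 3.
print(exp_mean_length(p, n, 1e-6))   # close to 1.75
print(exp_mean_length(p, n, 300.0))  # close to 3

# Lower bound from the coding theorem: L(t) >= H_alpha, alpha = 1/(1 + t).
t = 1.0
print(exp_mean_length(p, n, t) >= renyi_entropy(p, 1 / (1 + t)))  # True
```

The large exponent D^{t n_i} can overflow for long codes or large t; the example stays within double-precision range.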
We discuss the interest of escort distributions and Rényi entropy in the conte...
The problem of non-distorting compression (or coding) of sequences of symbols is considered....
An extension is presented to the source coding theorem traditionally based on the Shannon entropy an...
A new measure L(α), called average code length of order α, has been defined and its relationship wit...
An inequality concerning Kullback's I-divergence is applied to obtain a necessary condition for the ...
The problem of minimal expense coding (in particular for Rényi entropies, including the Shannon entr...
Guiasu and Picard [1] introduced the mean length for 'useful' codes. They called this len...
A new measure Lβα, called average code word length of order α and type β, has been define...
We introduce a quantity which is called Rényi's-Tsallis's entropy of order ξ and discuss some of it...
A new measure Lβα, called average code word length of order α and type β, is defined and its relation...
A new measure, L, called average code word length of order α, is defined and its relationship with ...
We will discuss entropy from the perspective of information theory. 1. Some coding termin...
In this paper we propose a revisitation of the topic of unique decodability and of some of the relat...
In this correspondence we provide new bounds on the expected length L of a binary one-to-one code fo...
In the present communication, we have developed suitable constraints for the given mean codeword...