Abstract. We will discuss entropy from the perspective of information theory.

1. Some coding terminology

For what follows it may be helpful to imagine that you are trying to encode a page of text written in English into a binary sequence for digital transmission or storage. Also make the unrealistic assumption that letters of the alphabet occur independently with some probability distribution.

For a set A, let A* denote the set of all finite sequences of elements of A. Sometimes it will be convenient to think of elements of A* as words. A prefix of a word a = (a1, ..., an) ∈ A* is a word of the form (a1, ..., ak) for some 1 ≤ k ≤ n.

Let S be a finite set, and p be a probability measure on S. We think of S as a source alphabet. A map c: S → {0, 1}* ...
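As an illustrative sketch (not part of the notes themselves), the prefix terminology above suggests a simple computational check: a code c: S → {0, 1}* is prefix-free when no codeword is a prefix of another. The function name and the example codes below are hypothetical, chosen only to demonstrate the definition.

```python
def is_prefix_free(code):
    """Check whether a code (dict: source symbol -> binary string) is prefix-free.

    After sorting the codewords lexicographically, any prefix relation
    must occur between adjacent words, so one linear pass suffices.
    """
    words = sorted(code.values())
    return all(not b.startswith(a) for a, b in zip(words, words[1:]))

# A prefix-free code on S = {a, b, c}:
print(is_prefix_free({'a': '0', 'b': '10', 'c': '11'}))  # True
# Not prefix-free: '0' is a prefix of '01'.
print(is_prefix_free({'a': '0', 'b': '01', 'c': '11'}))  # False
```

The adjacency trick works because if a word a is a prefix of a word b, then every codeword lying between them in sorted order also has a as a prefix, so in particular the word immediately after a does.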