The program complexity needed to enumerate a finite set of words is determined. The complexity is either an exponential or a linear function of the word length, depending on whether the redundancy is less than or greater than 100%. A corollary: the Varshamov-Gilbert bound on the cardinality of group error-correcting codes is tight for almost any channel with additive noise. The proofs are based on the concept of the collision index.
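As an illustration of the bound named in the corollary, the Varshamov-Gilbert (Gilbert-Varshamov) lower bound on code cardinality can be computed directly. This is a minimal sketch of the standard bound, not the paper's construction; the function name is my own.

```python
from math import comb

def gv_lower_bound(n: int, d: int, q: int = 2) -> int:
    """Gilbert-Varshamov lower bound on the size of a q-ary code
    of length n with minimum Hamming distance d:
        A_q(n, d) >= q**n / sum_{i=0}^{d-1} C(n, i) * (q-1)**i
    The denominator is the volume of a Hamming ball of radius d-1.
    """
    ball = sum(comb(n, i) * (q - 1) ** i for i in range(d))
    total = q ** n
    return total // ball + (1 if total % ball else 0)  # ceiling division

# Binary codes of length 10 with minimum distance 3: the ball volume is
# 1 + 10 + 45 = 56, so the bound guarantees at least ceil(1024/56) = 19 codewords.
print(gv_lower_bound(10, 3))  # 19
```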
The problem of minimal expense coding (in particular for Rényi entropies, including the Shannon entr...
Huffman's algorithm gives optimal codes, as measured by average codeword length, and the redundancy ...
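To make the Huffman claim concrete, here is a short sketch (my own illustration, not taken from the cited paper) that computes Huffman codeword lengths with a heap; the resulting average length is optimal among prefix codes, and its excess over the source entropy is the redundancy the snippet refers to.

```python
import heapq

def huffman_lengths(freqs):
    """Return optimal (Huffman) codeword lengths for a symbol->weight map.
    Each merge of the two lightest subtrees deepens their symbols by one level."""
    lengths = {s: 0 for s in freqs}
    heap = [(w, [s]) for s, w in freqs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        w1, syms1 = heapq.heappop(heap)
        w2, syms2 = heapq.heappop(heap)
        for s in syms1 + syms2:
            lengths[s] += 1          # symbols in the merged subtree sink one level
        heapq.heappush(heap, (w1 + w2, syms1 + syms2))
    return lengths

probs = {'a': 0.45, 'b': 0.25, 'c': 0.15, 'd': 0.15}
print(huffman_lengths(probs))  # {'a': 1, 'b': 2, 'c': 3, 'd': 3}
# Average length 1.85 bits; the gap to the entropy (about 1.84 bits) is the redundancy.
```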
This is a presentation about joint work between Hector Zenil and Jean-Paul Delahaye. Zenil presents ...
This paper sheds light on universal coding with respect to classes of memoryle...
According to Kolmogorov complexity theory, every finite binary string is compressible to a shortest cod...
This paper describes universal lossless coding strategies for compress...
Compression, estimation, and prediction are basic problems in information theory, statistics and mac...
The problem of non-distorting compression (or coding) of sequences of symbols is considered....
We first consider communication complexity which arises in applications where a system needs to comp...
A general framework for data compression, in which computational resource bounds are introduced at b...
This paper extends the usual notion of abstract program size complexity, studied by Kolmogor...
A quantity called the finite-state complexity is assigned to every infinite sequence of ele...
We present an upper bound on the zero-error list-coding capac...
This paper continues the study of algebraic code capacities, which were introduced by Ahlswede (1971...