It was mentioned by Kolmogorov (1968, IEEE Trans. Inform. Theory 14, 662–664) that the properties of algorithmic complexity and Shannon entropy are similar. We investigate one aspect of this similarity. Namely, we are interested in linear inequalities that are valid for Shannon entropy and for Kolmogorov complexity. It turns out that (1) all linear inequalities that are valid for Kolmogorov complexity are also valid for Shannon entropy and vice versa; (2) all linear inequalities that are valid for Shannon entropy are valid for ranks of finite subsets of linear spaces; (3) the opposite statement is not true; Ingleton's inequality (1971, “Combinatorial Mathematics and Its Applications,” pp. 149–167. Academic Press, San Diego) is valid ...
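A typical linear inequality of the kind compared here is submodularity, which holds exactly for Shannon entropy and up to a logarithmic error term for Kolmogorov complexity of strings of total length N; Ingleton's inequality is the standard example that holds for ranks of linear subspaces but is not a valid entropy inequality in general. A sketch in LaTeX (our illustration, not taken from the paper):

    H(A,B) + H(B,C) \ge H(A,B,C) + H(B)
    K(a,b) + K(b,c) \ge K(a,b,c) + K(b) - O(\log N)
    I(A;B) \le I(A;B \mid C) + I(A;B \mid D) + I(C;D)  % Ingleton's inequality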
We briefly survey some concepts related to empirical entropy --- normal numbers, de Bruijn sequences...
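For concreteness (our illustration; the survey itself is not quoted here), the order-k empirical block entropy of a string is the Shannon entropy of its k-block frequencies, and de Bruijn sequences maximise it. A minimal Python sketch, with names of our choosing:

from collections import Counter
from math import log2

def empirical_block_entropy(s: str, k: int = 1) -> float:
    # Shannon entropy (in bits) of the distribution of length-k blocks of s.
    blocks = [s[i:i + k] for i in range(len(s) - k + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# The de Bruijn sequence B(2,3) "00010111" contains every binary 3-block exactly once
# when read cyclically; appending the first two bits simulates the wrap-around.
print(empirical_block_entropy("00010111" + "00", k=3))  # 3.0 bits: all eight 3-blocks equally frequent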
A relation between Shannon entropy and Kerridge inaccuracy, which is known as Shannon in...
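For context (our gloss, not part of the abstract): Kerridge inaccuracy of an asserted distribution Q relative to the true distribution P is the cross-entropy, and the relation alluded to is presumably its classical lower bound by the Shannon entropy:

    -\sum_i p_i \log q_i \;\ge\; -\sum_i p_i \log p_i = H(P), \qquad \text{with equality iff } P = Q.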
We show that an information-theoretic property of Shannon's entropy power, known as concavity of ent...
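As a reminder of the standard definitions involved (not quoted from the abstract): for an n-dimensional random vector X with differential entropy h(X), the entropy power is N(X) below, and Costa's concavity of entropy power states that it is concave under addition of independent Gaussian noise:

    N(X) = \frac{1}{2\pi e}\, e^{2 h(X)/n}, \qquad t \mapsto N\big(X + \sqrt{t}\, Z\big) \ \text{is concave for } t \ge 0,

where Z \sim \mathcal{N}(0, I_n) is independent of X.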
Kolmogorov's very first paper on algorithmic information theory (Kolmogorov, Problemy pereda...
Preliminary versions of the paper were published in 2020 and in 2021. The final (substantially revised) ...
Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recu...
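The truncated sentence points at the standard bridge between the two notions: for a computable probability distribution P on strings, the expected prefix complexity matches the Shannon entropy up to the complexity of P itself (our paraphrase of the classical result, not the paper's statement):

    0 \;\le\; \sum_x P(x)\, K(x) - H(P) \;\le\; K(P) + O(1).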
This thesis is dedicated to studying the theory of entropy and its relation to the Kolmogorov comple...
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to w...
Two successful formalisations of the quantitative aspect of information arose over the course ...
One of the most popular methods of estimating the complexity of networks is to measure the entropy o...
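One widely used instance of this approach (our illustrative choice; the abstract does not specify which network entropy is meant) is the Shannon entropy of a graph's degree distribution. A minimal Python sketch:

from collections import Counter
from math import log2

def degree_distribution_entropy(adjacency: dict) -> float:
    # Shannon entropy (in bits) of the degree distribution of an undirected graph,
    # given as an adjacency mapping node -> set of neighbours.
    degrees = [len(neighbours) for neighbours in adjacency.values()]
    counts = Counter(degrees)
    n = len(degrees)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# Toy example: a path on four nodes has degree sequence (1, 2, 2, 1),
# so degrees 1 and 2 are equally likely and the entropy is exactly 1 bit.
path = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(degree_distribution_entropy(path))  # 1.0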
We study conditional linear information inequalities, i.e., linear inequalities for Shannon entropy ...
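Schematically, a conditional linear information inequality is an implication between linear expressions in the entropies of subsets S of the random variables X_1, ..., X_n (our notation, not the paper's); the hypotheses typically assert that some conditional mutual informations vanish:

    \Big( \sum_{S} \alpha^{(1)}_S H(X_S) = 0, \;\dots,\; \sum_{S} \alpha^{(m)}_S H(X_S) = 0 \Big) \;\Longrightarrow\; \sum_{S} \beta_S\, H(X_S) \ge 0.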
The paper deals with two similar inequalities: (1) 2K(\langle A,B,C\rangle) \le K(\langle A,B\rangle) ...
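Restored from the garbled rendering, inequality (1) appears to be of the known shape below; the analogous Shannon inequality is the natural candidate for its companion (2), although the truncated abstract does not show it, so both lines are offered only as a hedged reading:

    2K(\langle a,b,c\rangle) \le K(\langle a,b\rangle) + K(\langle a,c\rangle) + K(\langle b,c\rangle) + O(\log N),
    \qquad 2H(A,B,C) \le H(A,B) + H(A,C) + H(B,C).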
Entropy inequalities are very important in information theory and they play a crucial role in variou...
We consider the overgraph of the Kolmogorov entropy function and study whether it is a compl...
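For readers unfamiliar with the term (our gloss, assuming the usual convention): the overgraph of the complexity function K, called the entropy function here, is the set of pairs on or above its graph, which is computably enumerable because K is upper semicomputable:

    \{\, (x, n) : K(x) \le n \,\}.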