Entropy inequalities play a central role in proving converse coding theorems for network information theoretic problems. This thesis studies two new aspects of entropy inequalities. First, inequalities relating average joint entropies, rather than entropies over individual subsets, are studied. It is shown that the closures of the average entropy regions, where the averages are taken over all subsets of the same size and over all sliding windows of the same size respectively, are identical, implying that averaging over sliding windows always suffices as far as unconstrained entropy inequalities are concerned. Second, the existence of non-Shannon-type inequalities under partial symmetry is studied using the concepts of Shannon and non-Shannon groups. A com...
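To make the two averaging schemes concrete, here is a minimal Python sketch, not taken from the thesis: it computes the average joint entropy over all size-k subsets of the variables and over the n-k+1 sliding windows of size k. The function names and the random example pmf are illustrative assumptions.

```python
# Illustrative sketch (assumed names, not the thesis's code): compare the
# average joint entropy over all size-k subsets with the average over the
# n-k+1 sliding windows of size k, for a discrete joint pmf.
import itertools
import numpy as np

def joint_entropy(pmf, axes_to_keep):
    """Shannon entropy (bits) of the marginal of `pmf` on `axes_to_keep`."""
    drop = tuple(ax for ax in range(pmf.ndim) if ax not in axes_to_keep)
    marginal = pmf.sum(axis=drop)  # sum over an empty tuple leaves pmf intact
    p = marginal[marginal > 0]
    return float(-(p * np.log2(p)).sum())

def average_subset_entropy(pmf, k):
    """Average of H(X_S) over all subsets S of size k."""
    subsets = list(itertools.combinations(range(pmf.ndim), k))
    return sum(joint_entropy(pmf, s) for s in subsets) / len(subsets)

def average_window_entropy(pmf, k):
    """Average of H(X_i, ..., X_{i+k-1}) over the n-k+1 sliding windows."""
    n = pmf.ndim
    windows = [tuple(range(i, i + k)) for i in range(n - k + 1)]
    return sum(joint_entropy(pmf, w) for w in windows) / len(windows)

# Example: a random joint pmf over 4 binary variables.
rng = np.random.default_rng(0)
pmf = rng.random((2, 2, 2, 2))
pmf /= pmf.sum()
print(average_subset_entropy(pmf, 2), average_window_entropy(pmf, 2))
```

For a generic pmf the two averages differ; the thesis result concerns the closures of the corresponding average entropy regions, which coincide.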
In this article, we discuss the problem of establishing relations between information measures for n...
The demand for machine learning and knowledge extraction methods has been booming due to the unpre...
This report presents the result recently published in [1] that establishes a one-to-one corresponden...
The entropy region is constructed from vectors of random variables by collecting Shannon entropies o...
Entropy inequalities are central to information theory and play a crucial role in variou...
Upper and lower bounds are obtained for the joint entropy of a collection of random variabl...
Entropy and conditional mutual information are the key quantities information theory provides to mea...
Given n (discrete or continuous) random variables X_i, the (2^n – 1)-dimensional vector obtained by ...
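As a concrete illustration of that construction, the following minimal Python sketch (assumed names, not from the article) collects the joint entropy H(X_S) of every nonempty subset S of n discrete variables, yielding the 2^n - 1 entries of the entropic vector.

```python
# Illustrative sketch: build the (2^n - 1)-dimensional entropic vector of a
# discrete joint pmf by collecting H(X_S) for every nonempty subset S.
import itertools
import numpy as np

def entropic_vector(pmf):
    """Return {S: H(X_S)} for every nonempty subset S of the variables."""
    n = pmf.ndim
    vector = {}
    for k in range(1, n + 1):
        for subset in itertools.combinations(range(n), k):
            drop = tuple(ax for ax in range(n) if ax not in subset)
            marginal = pmf.sum(axis=drop)
            p = marginal[marginal > 0]
            vector[subset] = float(-(p * np.log2(p)).sum())
    return vector  # 2^n - 1 entries

# Example: three fair bits with X3 = X1 XOR X2, a classic entropic vector
# (singletons have entropy 1 bit; every pair and the triple have 2 bits).
pmf = np.zeros((2, 2, 2))
for a, b in itertools.product((0, 1), repeat=2):
    pmf[a, b, a ^ b] = 0.25
print(entropic_vector(pmf))
```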
Entropy and information can be considered dual: entropy is a measure of the subspace defined by the ...
This work studies a relation found in 2002 by T. H. Chan and R. W. Yeu...
Entropy was first introduced in thermodynamics and statistical mechanics, as well as in information the...
We show that a large class of network information theory problems can be cast as convex optimization...
The problem of determining the region of entropic vectors is a central one in information theory. Re...
Information plays an important role in our understanding of the physical world. Hence we propose an ...
Entropy, conditional entropy and mutual information for discrete-valued random variables pla...