Using an inequality for convex functions by Andrica and Raşa [1] (2.1), we point out a new inequality for log mappings and apply it in information theory for the Shannon entropy and mutual information.
We prove that the exponent of the entropy of one-dimensional projections of a log-concave random vec...
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in ...
We introduce, under a parametric framework, a family of inequalities between mutual information and ...
A new analytic inequality for logarithms which provides a converse to the arithmetic mean-geometric ...
In this paper we discuss new inequalities for logarithmic mapping and apply them in Information Theo...
In information theory, the fundamental tool is the entropy function, whose upper bound is de...
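The upper bound referred to in this abstract is the classical estimate H(X) ≤ log n for a distribution over n outcomes, attained exactly by the uniform distribution. A minimal numerical sketch (the bound is standard; the function and variable names are ours, not from the paper):

```python
import math

def shannon_entropy(p):
    """Shannon entropy (natural log) of a discrete distribution p."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# H(p) <= log n, with equality iff p is uniform over the n outcomes.
n = 4
uniform = [1.0 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]

print(shannon_entropy(uniform))  # equals log 4
print(shannon_entropy(skewed))   # strictly below log 4
```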
Using the concavity property of the log mapping and the weighted arithmetic mean - geometric mean in...
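The weighted arithmetic mean - geometric mean inequality invoked above states that for positive x_i and nonnegative weights w_i summing to 1, the weighted geometric mean ∏ x_i^{w_i} never exceeds the weighted arithmetic mean Σ w_i x_i (a direct consequence of the concavity of the logarithm). A quick check, with illustrative names of our own:

```python
import math

def weighted_am(w, x):
    """Weighted arithmetic mean of positive reals x with weights w (sum to 1)."""
    return sum(wi * xi for wi, xi in zip(w, x))

def weighted_gm(w, x):
    """Weighted geometric mean of positive reals x with weights w (sum to 1)."""
    return math.prod(xi ** wi for wi, xi in zip(w, x))

w = [0.5, 0.3, 0.2]   # weights summing to 1
x = [1.0, 4.0, 9.0]   # positive reals

# Concavity of log gives: weighted GM <= weighted AM.
assert weighted_gm(w, x) <= weighted_am(w, x)
```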
1. In [2,5,6,7], among others, several interpretations of the inequality ... for all ... such that ... were given, and the ...
By the use of a counterpart inequality for Jensen's discrete inequality established in [1] for ...
We show that an information-theoretic property of Shannon's entropy power, known as concavity of ent...
New inequalities for convex mappings of a real variable and applications in Information Theory for S...
Bounds for the logarithmic function are studied. In particular, we establish bounds with rational f...
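The truncated abstract does not show the paper's specific rational bounds; the classical pair x/(1+x) ≤ ln(1+x) ≤ x, valid for x > -1, illustrates the kind of bound in question and can be verified numerically:

```python
import math

def check_log_bounds(x):
    """Check the classical rational bounds x/(1+x) <= ln(1+x) <= x for x > -1."""
    lower = x / (1 + x)
    upper = x
    return lower <= math.log1p(x) <= upper

# Holds on both sides of 0 within the domain x > -1.
assert all(check_log_bounds(x) for x in [-0.5, 0.1, 1.0, 10.0])
```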
The aim of this short note is to study Shannon's entropy power along entropic interpolations, thus gener...