The aim of this short note is to study Shannon's entropy power along entropic interpolations, thus generalizing Costa's concavity theorem. We provide two proofs of independent interest: the former via Γ-calculus, hence applicable to more abstract frameworks; the latter with an explicit remainder term, reminiscent of Villani (IEEE Trans. Inf. Theory, 2006), which allows us to characterize the case of equality.
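For reference, the objects this abstract refers to can be written out explicitly. The following is a sketch of the standard definitions and of Costa's theorem in its classical heat-flow form; the notation (h, N, the Gaussian perturbation X + √t Z) is assumed here, not taken from the note itself.

```latex
% Shannon (differential) entropy of a random vector X with density f on R^n:
h(X) = -\int_{\mathbb{R}^n} f(x)\,\log f(x)\,\mathrm{d}x .
% Shannon entropy power:
N(X) = \frac{1}{2\pi e}\, e^{\frac{2}{n} h(X)} .
% Costa's concavity theorem (1985): for Z \sim \mathcal{N}(0,\mathrm{Id}) independent of X,
% the entropy power is concave along the heat semigroup,
\frac{\mathrm{d}^2}{\mathrm{d}t^2}\, N\!\left(X + \sqrt{t}\,Z\right) \le 0, \qquad t > 0 .
```

For a Gaussian X with covariance σ²·Id one has N(X + √t Z) = σ² + t, which is affine in t, so the inequality is saturated; this is the classical equality case that a proof with an explicit remainder term can characterize.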
For statistical systems that violate one of the four Shannon–Khinchin axioms, entropy takes a more g...
The product probability property, known in the literature as statistical independence, is examined first...
Recently, Savaré-Toscani proved that the Rényi entropy power of general probability densities solvin...
We show that an information-theoretic property of Shannon's entropy power, known as concavity of ent...
A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is sh...
An extension of the entropy power inequality to the form N_r^α (X + Y) ≥ N_r^α (X) + N_r^α (Y) with ...
Several entropies generalize the Shannon entropy and have it as their li...
We prove that the exponent of the entropy of one-dimensional projections of a log-concave random vec...
A framework for deriving Rényi entropy-power inequalities (EPIs) is presented ...
We prove the following generalization of the Entropy Power Inequality: h(Ax) ≥ h(Ax̃), where h(\Delt...
We prove that the reciprocal of Fisher information of a logconcave probability density is concave in...
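Stated symbolically, the result this abstract announces reads as follows; since the line is truncated, the precise setting (in particular the dimension) is assumed here, and the notation is not taken from the paper itself.

```latex
% Fisher information of a random vector X with smooth density f:
I(X) = \int \frac{|\nabla f(x)|^2}{f(x)}\,\mathrm{d}x .
% For log-concave X, with Z \sim \mathcal{N}(0,\mathrm{Id}) independent of X,
% the reciprocal of the Fisher information is concave under Gaussian perturbation:
\frac{\mathrm{d}^2}{\mathrm{d}t^2}\, \frac{1}{I\!\left(X + \sqrt{t}\,Z\right)} \le 0, \qquad t > 0 .
```

This is a pointwise strengthening of the Stam-type behavior of Fisher information along the heat flow: for a Gaussian X with variance σ², 1/I(X + √t Z) = σ² + t is affine in t, so equality holds in that case.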