We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources under the distortion measure d(x, x̂) = |x − x̂|^r, with r ≥ 1, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most log(√(πe)) ≈ 1.5 bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most log(√(πe/2)) ≈ 1 bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels with log-concave noise.
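As a quick numeric sanity check of the constants quoted above, the short Python sketch below evaluates log(√(πe)) and log(√(πe/2)) in bits, and then illustrates the mean-square-error case for a unit-variance Laplace source (a log-concave example of our own choosing, not one taken from the abstract), using two standard facts: R(d) ≤ (1/2) log₂(σ²/d) for any source of variance σ², and the Shannon lower bound for mean-square error, R_SLB(d) = h(X) − (1/2) log₂(2πe d).

from math import pi, e, log2, sqrt

# Additive gap constants from the abstract, in bits.
gap_general = 0.5 * log2(pi * e)      # log sqrt(pi*e)   ~ 1.547 bits (|x - x_hat|^r distortion)
gap_mse     = 0.5 * log2(pi * e / 2)  # log sqrt(pi*e/2) ~ 1.047 bits (mean-square error)
print(f"log sqrt(pi*e)   = {gap_general:.3f} bits")
print(f"log sqrt(pi*e/2) = {gap_mse:.3f} bits")

# Illustration (not from the paper): unit-variance Laplace source under mean-square error.
# Combining R(d) <= 0.5*log2(sigma^2/d) with the Shannon lower bound
# R_SLB(d) = h(X) - 0.5*log2(2*pi*e*d) gives a d-independent bound on the gap:
# R(d) - R_SLB(d) <= 0.5*log2(2*pi*e*sigma^2) - h(X).
b = 1 / sqrt(2)                 # Laplace scale parameter giving variance 2*b^2 = 1
h_laplace = log2(2 * b * e)     # differential entropy of a Laplace(b) density, in bits
gap_laplace = 0.5 * log2(2 * pi * e) - h_laplace
print(f"Laplace MSE gap <= {gap_laplace:.3f} bits, within the {gap_mse:.3f}-bit bound")

Running the sketch gives roughly 1.547 bits, 1.047 bits, and a Laplace gap of about 0.10 bits, consistent with the stated bounds.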