We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure d(x, x̂) = |x ...
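As a hedged illustration of the quantities involved (not the paper's actual bound, whose constant is not reproduced here): for a log-concave density such as the Laplace distribution, the differential entropy and the p-th absolute moments have closed forms, and the Gaussian of matching variance — the maximum-entropy distribution for that variance — exceeds the Laplace entropy only by a small additive constant. The sketch below computes these quantities; all function names are for illustration only.

```python
import math

def laplace_entropy(b: float) -> float:
    # Differential entropy (in nats) of the Laplace(b) density
    # f(x) = exp(-|x|/b) / (2b):  h(X) = 1 + log(2b).
    return 1.0 + math.log(2.0 * b)

def laplace_abs_moment(b: float, p: float) -> float:
    # p-th absolute moment of Laplace(b):  E|X|^p = Gamma(p + 1) * b^p.
    return math.gamma(p + 1.0) * b ** p

def gaussian_entropy(var: float) -> float:
    # Differential entropy (in nats) of a Gaussian with variance `var`.
    return 0.5 * math.log(2.0 * math.pi * math.e * var)

b = 1.0
h_lap = laplace_entropy(b)
# Laplace(b) has variance 2*b^2; compare against the max-entropy Gaussian
# with the same variance.
h_gauss = gaussian_entropy(2.0 * b * b)
print(h_lap, h_gauss, h_gauss - h_lap)  # gap ≈ 0.072 nats
```

The gap of roughly 0.072 nats for the Laplace case illustrates the general phenomenon the abstracts above quantify: for log-concave distributions, differential entropy is sandwiched within explicit additive constants of the Gaussian benchmark, which is what makes moment-based lower bounds and reverse entropy power inequalities possible.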
We develop a reverse entropy power inequality for convex measures, which may be seen as an a...
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in ...
The nonnegativity of relative entropy implies that the differential entropy of a random vector X wi...
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of t...
We derive a lower bound on the differential entropy for a symmetric log-concave random variable X in t...
We show that for log-concave real random variables with fixed variance the Shannon differential entr...
We prove that the exponent of the entropy of one-dimensional projections of a log-concave random vec...
Using a sharp version of the reverse Young inequality, and a Renyi entropy comparison result due to ...
We prove a quantitative dimension-free bound in the Shannon–Stam entropy inequality for the convol...
We prove that the reciprocal of the Fisher information of a log-concave probability density is concave in...
Suppose that an infinite sequence is produced by independent trials of a random variable with a fixe...