We show that among log-concave real random variables with fixed variance, the Shannon differential entropy is minimized by an exponential random variable. We apply this result to derive upper bounds on the capacities of additive noise channels with log-concave noise. We also improve the constants in the reverse entropy power inequalities for log-concave random variables.
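A quick worked consequence (a standard computation from the exponential density, added here for context and not quoted from the abstract):
\[
f_{\lambda}(x)=\lambda e^{-\lambda x}\ (x\ge 0),\qquad
h\big(\mathrm{Exp}(\lambda)\big)=-\int_{0}^{\infty} f_{\lambda}\ln f_{\lambda}\,dx = 1-\ln\lambda,\qquad
\operatorname{Var}\big(\mathrm{Exp}(\lambda)\big)=\lambda^{-2},
\]
so fixing the variance at \(\sigma^{2}\) (i.e. \(\lambda=1/\sigma\)) gives the minimizer's entropy \(1+\ln\sigma=\tfrac12\ln(e^{2}\sigma^{2})\). If the stated result holds as written, every log-concave real \(X\) therefore satisfies \(h(X)\ge \tfrac12\ln\!\big(e^{2}\operatorname{Var}(X)\big)\).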
Using a sharp version of the reverse Young inequality, and a Rényi entropy comparison result due to ...
Information concentration of probability measures has important implications in learning theory. Re...
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of t...
We prove that the exponent of the entropy of one-dimensional projections of a log-concave random vec...
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in ...
We derive a lower bound on the differential entropy for a symmetric log-concave random variable X in t...
We prove that the reciprocal of the Fisher information of a log-concave probability density is concave in...
It is proven that a conjecture of Tao (2010) holds true for log-concave random variables on the inte...
We prove a quantitative dimension-free bound in the Shannon–Stam entropy inequality for the convol...
We prove that the Poisson distribution maximises entropy in the class of ultra log-concave d...
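For context, the standard definition of ultra log-concavity (my addition, not part of the truncated abstract): a probability mass function \(p\) on \(\mathbb{Z}_{+}\) is ultra log-concave when \(k\mapsto p(k)/\pi_{\lambda}(k)\) is log-concave for the Poisson weights \(\pi_{\lambda}(k)=e^{-\lambda}\lambda^{k}/k!\), equivalently
\[
k\,p(k)^{2}\;\ge\;(k+1)\,p(k+1)\,p(k-1),\qquad k\ge 1 .
\]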