Abstract. We prove that the Poisson distribution maximises entropy in the class of ultra log-concave distributions, extending a result of Harremoës. The proof uses ideas concerning log-concavity and a semigroup action involving adding Poisson variables and thinning. We go on to show that the entropy is a concave function along this semigroup.
We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two ‘s...
Sufficient conditions are developed, under which the compound Poisson distribution has maximal entro...
Motivated, in part, by the desire to develop an information-theoretic foundation for compound Poisso...
We utilize and extend a simple and classical mechanism, combining log-concavity and majorization in ...
We show that for log-concave real random variables with fixed variance the Shannon differential entr...
For statistical systems that violate one of the four Shannon–Khinchin axioms, entropy takes a more g...
We prove that the exponent of the entropy of one-dimensional projections of a log-concave random vec...
It is proven that a conjecture of Tao (2010) holds true for log-concave random variables on the inte...