An upper bound is established for the entropy of a positive-integer-valued random variable X in terms of the expectation of certain functions of X. In particular, we show that the entropy is finite if E log X < ∞. Further, if Pr{X = n} is nonincreasing in n (n = 1, 2, …), then the entropy is finite only if E log X < ∞.
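The claim above can be illustrated numerically. In this sketch (the distribution p_n ∝ 1/n³ and the truncation point are my own choices, not taken from the paper), both E[log X] and H(X) come out finite, consistent with the stated sufficient condition; and since p_n is nonincreasing, p_n ≤ 1/n, so H(X) ≥ E[log X] as well.

```python
import math

# Illustrative distribution (my assumption, not from the paper):
# p_n proportional to 1/n^3 on the positive integers.
N = 100_000                          # truncation point for the numerical sums
weights = [1.0 / n**3 for n in range(1, N + 1)]
Z = sum(weights)                     # normalising constant (≈ zeta(3))
p = [w / Z for w in weights]

# Entropy H(X) = -sum p_n log p_n and the moment E[log X], both in nats.
H = -sum(q * math.log(q) for q in p)
E_log = sum(q * math.log(n) for n, q in zip(range(1, N + 1), p))

# p_n is nonincreasing, hence p_n <= 1/n and -log p_n >= log n,
# which forces H >= E[log X].
print(f"E[log X] ≈ {E_log:.4f}, H(X) ≈ {H:.4f}")
```

Both sums converge because the tail terms decay like (log n)/n³, so the truncation at N introduces only a negligible error.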
Abstract: We investigate how the entropy numbers (e_n(T)) of an arbitrary Hölder-continuous operator T:...
A characterization of the entropy −∫ f log f dx of a random variable is provided. If X is a random v...
The maximum entropy principle is widely used to determine non-committal probabilities on a finite do...
It is well known that the entropy H(X) of a discrete random variable X is always greater than or equ...
The entropy H(X) of a discrete random variable X of alphabet size m is always non-negative and upper...
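The entry above appeals to the classical bounds 0 ≤ H(X) ≤ log m for an alphabet of size m, with equality on the right at the uniform distribution. A minimal numerical check (the alphabet size and the sampled distribution are arbitrary choices of mine):

```python
import math
import random

def entropy(p):
    """Shannon entropy in nats of a probability vector p."""
    return -sum(q * math.log(q) for q in p if q > 0)

m = 8
uniform = [1 / m] * m                       # the maximiser: H = log m

random.seed(0)
raw = [random.random() for _ in range(m)]   # an arbitrary distribution
arbitrary = [x / sum(raw) for x in raw]

print(entropy(uniform), math.log(m))        # both ≈ log m
print(entropy(arbitrary) <= math.log(m))
```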
1. In [2,5,6,7], among others, several interpretations of the inequality for all such that were given and the ...
This paper concerns the folklore statement that ``entropy is a lower bound for compression''. More p...
Abstract: The ϵ-entropy of the class F∗ of real-valued monotone functions from [0, 1] to [0, 1] in the...
It is proven that a conjecture of Tao (2010) holds true for log-concave random variables on the inte...
Abstract: We prove that the Poisson distribution maximises entropy in the class of ultra log-concave d...
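The maximum-entropy property of the Poisson distribution can be probed numerically. In this sketch (the mean λ = 3 and the comparison family are my own choices, not from the abstract), the Binomial(n, λ/n) laws are ultra log-concave with mean λ, so the result predicts H(Binomial(n, λ/n)) ≤ H(Poisson(λ)) for every n:

```python
import math

def poisson_entropy(lam, tol=1e-12):
    """Entropy (nats) of Poisson(lam), summed until the tail is negligible."""
    h, k, pk = 0.0, 0, math.exp(-lam)
    while pk > tol or k < lam:
        if pk > 0:
            h -= pk * math.log(pk)
        k += 1
        pk *= lam / k          # recurrence p_{k} = p_{k-1} * lam / k
    return h

def binomial_entropy(n, p):
    """Entropy (nats) of Binomial(n, p)."""
    h = 0.0
    for k in range(n + 1):
        pk = math.comb(n, k) * p**k * (1 - p) ** (n - k)
        if pk > 0:
            h -= pk * math.log(pk)
    return h

lam = 3.0
for n in (5, 20, 100):
    print(n, binomial_entropy(n, lam / n), poisson_entropy(lam))
```

As n grows, Binomial(n, λ/n) converges to Poisson(λ) and the entropy gap shrinks, but the binomial entropy stays below the Poisson entropy throughout.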
Abstract: In information theory, the fundamental tool is the entropy function, whose upper bound is de...
An extension of the entropy power inequality to the form N_r^α (X + Y) ≥ N_r^α (X) + N_r^α (Y) with ...
Abstract: The behaviour of the entropy numbers e_k(id: l_p^n → l_q^n), 0 < p < q ⩽ ∞, is well known (up to multiplic...
A new measure L(α), called average code length of order α, has been defined and its relationship wit...