In the first part of this paper, we present entropy measures defined from the distance, in the sense of Csiszár, between a distribution and a fixed distribution. We study concavity, non-negativity, and the Dalton–Pielou condition, and we obtain a unique fixed distribution that satisfies these three conditions. We then study the homogeneity of the proposed functions when the fixed distribution is the equiprobable distribution, obtaining the f-entropy functional as a particular case of the measures studied in the first part.
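The abstract above concerns entropy measures built from a Csiszár divergence between a distribution and a fixed reference distribution. A minimal discrete sketch (function names and the choice of f are illustrative assumptions, not the paper's notation) of the divergence to the equiprobable distribution:

```python
import math

def csiszar_divergence(p, q, f):
    """Csiszar f-divergence: D_f(p || q) = sum_i q_i * f(p_i / q_i)."""
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

# With f(t) = t * log(t), D_f reduces to the Kullback-Leibler divergence.
kl_generator = lambda t: t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.25, 0.25]
n = len(p)
uniform = [1.0 / n] * n  # the equiprobable distribution

# Distance to the equiprobable distribution; for this choice of f it
# equals log(n) - H(p), where H is the Shannon entropy.
d = csiszar_divergence(p, uniform, kl_generator)
shannon = -sum(pi * math.log(pi) for pi in p if pi > 0)
print(abs(d - (math.log(n) - shannon)) < 1e-12)  # True
```

The identity checked at the end shows why such a divergence to the uniform distribution induces an entropy measure: maximizing entropy is equivalent to minimizing the distance to equiprobability.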
Zipf–Mandelbrot entropy has many applications in many applied sciences, for example, in information ...
As entropy is also an important quantity in physics, we relate our results to physical processes by ...
This thesis deals with a certain set function called entropy and its applications to some pr...
A new kind of entropy will be introduced which generalizes both the differential entropy and the cum...
Shannon entropy of a probability measure $P$, defined as $-\int_X (dP/d\mu)\,\ln(dP/d\mu)\,d\mu$ ...
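The measure-theoretic definition above specializes to the familiar discrete formula when $\mu$ is counting measure, so that $dP/d\mu$ is just the probability mass function. A quick sketch of that special case (an illustrative discrete analogue, not the abstract's general setting):

```python
import math

def shannon_entropy(p):
    """Discrete analogue: mu is counting measure, dP/dmu = p_i, so
    H(P) = -sum_i p_i * ln(p_i), with 0 * ln(0) taken as 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# A fair coin attains the maximum entropy ln(2) over two outcomes.
print(shannon_entropy([0.5, 0.5]))  # ln 2 ≈ 0.6931
```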
This article provides a completion to theories of information based on entropy, resolving a longstan...
We consider the maximum entropy problems associated with Rényi $Q$-entropy, su...
The classical Csiszár–Kullback inequality bounds the $L^1$-distance of two probability densities in ...
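In its discrete form (often called Pinsker's inequality), the Csiszár–Kullback bound reads $\|p - q\|_1 \le \sqrt{2\,\mathrm{KL}(p\,\|\,q)}$. A quick numerical check on an arbitrary example (the distributions chosen here are illustrative):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) for discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def l1(p, q):
    """L^1 (total variation, up to a factor 2) distance."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

p = [0.7, 0.2, 0.1]
q = [1 / 3, 1 / 3, 1 / 3]

# Csiszar-Kullback (Pinsker) bound: ||p - q||_1 <= sqrt(2 * KL(p || q))
print(l1(p, q) <= math.sqrt(2 * kl(p, q)))  # True
```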
Investigating the entropy distance between the Wiener measure, $W_{t_0,\tau}$, and stationary Gaussian measure...
It is well known that the entropy H(X) of a discrete random variable X is always greater than or equ...
As is well known in statistical physics, the stationary distribution can be obtained by maximizing...
A functional defined by means of entropy is considered. It is shown that it is a distance in the set...
Entropy is a key measure in studies related to information theory and its many applications. Campbel...
We provide a unifying axiomatics for Rényi's entropy and the non-extensive entropy of Tsallis. It is sh...