Kullback-Leibler relative-entropy, or KL-entropy, of P with respect to R, defined as $\int_X \ln \frac{dP}{dR} \, dP$, where P and R are probability measures on a measurable space $(X, \mathfrak{B})$, plays a basic role in the definitions of classical information measures. It overcomes a shortcoming of Shannon entropy, whose discrete-case definition cannot be extended naturally to the nondiscrete case. Further, entropy and other classical information measures can be expressed in terms of KL-entropy, and hence the properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. An important theorem in this respect is the Gelfand-Yaglom-Perez (GYP) theorem, which equips KL-entropy with a fundamental definition and can be stated as: measure-theoretic KL-entropy equals the supremum of the relative entropies of the induced discrete distributions, taken over all measurable partitions of X.
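As a concrete, hedged illustration of the GYP theorem (a toy example, not taken from the paper), the Python sketch below computes the closed-form KL-entropy between two Gaussians P = N(0, 1) and R = N(1, 1.5²) and compares it with the discrete relative entropies induced by uniform partitions of the line; under refinement the discrete values increase toward the measure-theoretic value, as the theorem asserts.

```python
# Numerical illustration of the Gelfand-Yaglom-Perez theorem:
# D(P||R) = int_X ln(dP/dR) dP is the supremum, over measurable
# partitions of X, of the relative entropies of the induced
# discrete distributions.  Toy choices: P = N(0,1), R = N(1,1.5^2),
# partitions = nested uniform grids on [-10, 10].
import numpy as np
from scipy.stats import norm

P, R = norm(0.0, 1.0), norm(1.0, 1.5)

# Closed-form KL divergence between two univariate Gaussians.
mu_p, s_p, mu_r, s_r = 0.0, 1.0, 1.0, 1.5
kl_exact = np.log(s_r / s_p) + (s_p**2 + (mu_p - mu_r)**2) / (2 * s_r**2) - 0.5

for n_bins in (4, 16, 64, 256):
    edges = np.linspace(-10.0, 10.0, n_bins + 1)
    p = np.diff(P.cdf(edges))      # P-mass of each partition cell
    r = np.diff(R.cdf(edges))      # R-mass of each partition cell
    mask = p > 0                   # cells with zero P-mass contribute 0
    kl_disc = np.sum(p[mask] * np.log(p[mask] / r[mask]))
    print(f"{n_bins:4d} cells: {kl_disc:.6f}   (supremum ~ {kl_exact:.6f})")
```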
The refinement axiom for entropy has been provocative in providing foundations of information theory...
In many practical situations, we have only partial information about the probabilities. In some case...
We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two ‘s...
The measure-theoretic definition of Kullback-Leibler relative-entropy (or simply KL-entropy) plays a...
Shannon entropy of a probability measure P, defined as $-\int_X \frac{dP}{d\mu} \ln \frac{dP}{d\mu} \, d\mu$...
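A small sketch of this definition (with the added assumption that μ is Lebesgue measure on the real line, so dP/dμ is the ordinary density f and the formula reduces to differential entropy): for P = N(0, σ²) the numerical integral should match the closed form ½ ln(2πeσ²).

```python
# Measure-theoretic Shannon entropy -int_X (dP/dmu) ln(dP/dmu) dmu
# with mu = Lebesgue measure, evaluated for P = N(0, sigma^2) by
# quadrature and checked against the closed form 0.5*ln(2*pi*e*sigma^2).
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

sigma = 2.0
f = norm(0.0, sigma).pdf           # dP/dmu, i.e. the density of P

h_numeric, _ = quad(lambda x: -f(x) * np.log(f(x)), -40, 40)
h_exact = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(h_numeric, h_exact)          # both ~ 2.1121
```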
We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler rel...
We consider the maximum entropy problems associated with Rényi $Q$-entropy, su...
We introduce a three-parameter generalized normal distribution, which belongs to the Kotz type distr...
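The abstract is cut off before the distribution's form; as a stand-in, here is the common three-parameter exponential-power ("generalized normal") density $f(x) = \frac{b}{2a\Gamma(1/b)} e^{-(|x-\mu|/a)^b}$, which also sits in the Kotz-type family. The paper's parametrization may differ, so this is only an orienting sketch: b = 2 recovers the normal law and b = 1 the Laplace law.

```python
# Exponential-power ("generalized normal") density with location mu,
# scale a, and shape b; a stand-in for the paper's Kotz-type variant.
import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

def gen_normal_pdf(x, mu=0.0, a=1.0, b=2.0):
    return b / (2 * a * gamma(1 / b)) * np.exp(-(np.abs(x - mu) / a) ** b)

# Sanity check: the density integrates to 1 for each shape parameter.
for b in (1.0, 2.0, 4.0):
    total, _ = quad(gen_normal_pdf, -np.inf, np.inf, args=(0.0, 1.0, b))
    print(f"b = {b}: integrates to {total:.6f}")
```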
In this communication, we characterize a measure of information of type (α, β, γ) by taking certain ...
Based on the Jaynes principle of maximum informational entropy, we find a generalized probabilit...
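As a minimal numerical sketch of the Jaynes principle (a toy finite-alphabet setup, not the generalized distribution of the paper): maximizing $-\sum_i p_i \ln p_i$ under a normalization and a mean constraint yields the Gibbs form $p_i \propto e^{-\lambda x_i}$, which the code below verifies by checking that $\ln p_i$ is affine in $x_i$.

```python
# Jaynes maximum entropy on a finite alphabet: maximize -sum p_i ln p_i
# subject to sum p_i = 1 and sum p_i x_i = m.  The solution has the
# Gibbs/exponential form p_i proportional to exp(-lam * x_i).
import numpy as np
from scipy.optimize import minimize

x = np.arange(6, dtype=float)      # support {0, ..., 5}
m = 1.5                            # prescribed mean

cons = ({'type': 'eq', 'fun': lambda p: p.sum() - 1.0},
        {'type': 'eq', 'fun': lambda p: p @ x - m})
res = minimize(lambda p: np.sum(p * np.log(p)),   # negative entropy
               np.full(len(x), 1.0 / len(x)),
               bounds=[(1e-12, 1.0)] * len(x), constraints=cons)
p = res.x

slope, _ = np.polyfit(x, np.log(p), 1)            # ln p_i = -lam*x_i + c
print("maxent distribution:", np.round(p, 4))
print("slope of ln p in x (i.e. -lambda):", slope)
```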
Our aim in this paper is to investigate properties of Shannon and Rényi entropy, Kullb...
Currently we are witnessing the revaluation of huge data resources that should be analyzed carefully...
Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized informat...
This paper is a review of a particular approach to the method of maximum entropy as a general framew...
We introduce an axiomatic approach to entropies and relative entropies that relies only on minimal i...