summary:Kullback's relative information and Kerridge's inaccuracy are two information-theoretic measures associated with a pair of probability distributions of a discrete random variable. The authors study a generalized measure which in particular contains a parametric generalization of relative information and inaccuracy. Some important properties of this generalized measure, along with an inversion theorem, are also studied.
Kullback-Leibler relative-entropy or KL-entropy of P with respect to R defined as $\int_X \ln\frac{dP}{dR}\,dP$, where...
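For the discrete case, the KL-entropy above reduces to a sum over the support. A minimal sketch (the function name and argument conventions are illustrative, not from the source):

```python
import math

def kl_divergence(p, r):
    """Discrete Kullback-Leibler relative entropy of p with respect to r.

    p, r: sequences of probabilities over the same support (each summing to 1),
    with r[i] > 0 wherever p[i] > 0.  The convention 0 * log(0/r) = 0 is used,
    so terms with p[i] == 0 are skipped.
    """
    return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)
```

For identical distributions the measure is zero, e.g. `kl_divergence([0.5, 0.5], [0.5, 0.5])` returns `0.0`, and it grows as p diverges from r.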
The Shannon entropy based on the probability density function is a key information measure with appl...
In this paper, we discuss the cumulative measure of inaccuracy in k-lower record values and study ch...
summary:In this paper the mean and the variance of the Maximum Likelihood Estimator (MLE) of Kullbac...
Numerous information indices have been developed in the information theoretic literature and extensi...
Generalized information measures play an important role in the measurement of uncertainty of certain...
We introduce a three-parameter generalized normal distribution, which belongs to the Kotz type distr...
The literature on inconsistency measures has ignored a distinction, that is, d...
In this note, using some refinements of Jensen’s discrete inequality, we give some new refinement...
Finding the relationships between information measures and statistical constants leads to the applic...
Inaccuracy and information measures based on the cumulative residual entropy are useful in variou...