Kullback's relative information and Kerridge's inaccuracy are two information-theoretic measures associated with a pair of probability distributions of a discrete random variable. The authors study a generalized measure which, in particular, contains a parametric generalization of both relative information and inaccuracy. Some important properties of this generalized measure, along with an inversion theorem, are also studied.
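For concreteness, the two baseline measures have simple closed forms for discrete distributions: Kullback's relative information is $D(P\|Q)=\sum_i p_i \log(p_i/q_i)$ and Kerridge's inaccuracy is $K(P;Q)=-\sum_i p_i \log q_i$. The sketch below illustrates both and their standard relationship; the function names and example distributions are illustrative, not taken from the paper.

```python
import numpy as np

def relative_information(p, q):
    """Kullback's relative information D(P || Q) = sum_i p_i log(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0                      # terms with p_i = 0 contribute nothing
    return float(np.sum(p[m] * np.log(p[m] / q[m])))

def inaccuracy(p, q):
    """Kerridge's inaccuracy K(P; Q) = -sum_i p_i log q_i."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    m = p > 0
    return float(-np.sum(p[m] * np.log(q[m])))

p = [0.5, 0.3, 0.2]                # true distribution P
q = [0.4, 0.4, 0.2]                # asserted distribution Q
# Known identity: K(P; Q) = H(P) + D(P || Q), where H is Shannon entropy.
H = -sum(pi * np.log(pi) for pi in p)
assert abs(inaccuracy(p, q) - (H + relative_information(p, q))) < 1e-12
```

The identity $K(P;Q)=H(P)+D(P\|Q)$ is what ties the two measures together: inaccuracy decomposes into the uncertainty of $P$ plus the penalty for asserting $Q$ instead of $P$.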
The purpose of this manuscript is twofold: (i) to provide a family of inequalities that unifies the ...
In this work a univariate random variable is considered which includes some important partic...
The Chernoff information between two probability measures is a statistical divergence measuring thei...
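As a hedged aside, the Chernoff information between two discrete distributions has the standard variational form $C(P,Q)=\max_{0\le\lambda\le 1}\bigl(-\log\sum_x p(x)^{\lambda} q(x)^{1-\lambda}\bigr)$. A minimal grid-search sketch (names and example values are our own, not from the paper):

```python
import numpy as np

def chernoff_information(p, q, grid=1001):
    """C(P, Q) = max over lam in [0, 1] of -log sum_x p(x)^lam q(x)^(1-lam),
    approximated here by a simple grid search over lam."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    lams = np.linspace(0.0, 1.0, grid)
    vals = [-np.log(np.sum(p**lam * q**(1.0 - lam))) for lam in lams]
    return float(max(vals))

p = [0.5, 0.3, 0.2]
q = [0.2, 0.3, 0.5]
print(chernoff_information(p, q))   # symmetric: equals chernoff_information(q, p)
```

Swapping the arguments maps $\lambda \mapsto 1-\lambda$, so the maximum is unchanged, which is why Chernoff information is symmetric even though each Rényi-type term is not.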
The directed divergence of type β, which generalizes Kullback's directed divergence or information me...
The representation for measures of information which are symmetric, expansible, and have the branchi...
In this paper the mean and the variance of the Maximum Likelihood Estimator (MLE) of Kullbac...
The aim of this article is to give an axiomatic characterization of the multivariate measure...
Generalized length biased distribution is defined as $h(x)=\phi_r(x)f(x),\; x>0$, where $f(x)...
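The weight function $\phi_r$ is truncated in the snippet above, so as an assumption-labeled illustration the sketch below takes the classical length-biased case $\phi(x)=x/E[X]$, under which $h(x)=x f(x)/E[X]$ is again a proper density:

```python
import numpy as np
from scipy.integrate import quad

# Classical length-biased weight phi(x) = x / E[X], so h(x) = x f(x) / E[X].
lam = 2.0
f = lambda x: lam * np.exp(-lam * x)   # Exp(lam) density with E[X] = 1/lam
h = lambda x: x * f(x) * lam           # length-biased density x f(x) / E[X]

total, _ = quad(h, 0, np.inf)
assert abs(total - 1.0) < 1e-8         # h is again a proper density
# Here h(x) = lam^2 x e^(-lam x), i.e. a Gamma(shape=2, rate=lam) law.
```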
Some propositions add more information to bodies of propositions than do others. We start with intui...
The present communication describes a new generalised measure of useful directed divergence based on...
The paper discusses the philosophical conclusions which the interrelation between quantum mechanics...
Kullback-Leibler relative-entropy or KL-entropy of P with respect to R, defined as $\int_X \ln\frac{dP}{dR}\,dP$, where...
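When $P$ and $R$ both admit densities, the Radon-Nikodym derivative $dP/dR$ reduces to the density ratio and the KL-entropy becomes the familiar integral $\int p(x)\ln\bigl(p(x)/r(x)\bigr)\,dx$. A minimal numerical sketch, assuming two Gaussians so a closed form is available as a cross-check (the specific parameters are our own choice):

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# With densities p and r, dP/dR = p(x) / r(x), so the KL-entropy is the
# integral of p(x) log(p(x) / r(x)).
p = norm(0.0, 1.0)                     # P = N(0, 1)
r = norm(1.0, 2.0)                     # R = N(1, 4)

kl_numeric, _ = quad(lambda x: p.pdf(x) * np.log(p.pdf(x) / r.pdf(x)), -20, 20)

# Closed form for two Gaussians, used as a cross-check:
mu1, s1, mu2, s2 = 0.0, 1.0, 1.0, 2.0
kl_closed = np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5
assert abs(kl_numeric - kl_closed) < 1e-6
```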
In statistical estimation problems, measures between probability distributions play significa...
The quantitative-qualitative measure of information as given by Belis and Guiaşu is additive, the ad...