A set of Fisher information properties is presented in order to draw a parallel with similar properties of Shannon differential entropy. Already known properties are presented together with new ones, which include: (i) a generalization of mutual information for Fisher information; (ii) a new proof that Fisher information increases under conditioning; (iii) a proof that Fisher information decreases in Markov chains; and (iv) a bound on estimation error using Fisher information. This last result is especially important because it complements Fano’s inequality, i.e., a lower bound for estimation error, by showing that Fisher information can be used to define an upper bound for this error. In this way, it is shown that Shannon’s differential entropy, w...
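As a companion to the abstract above, here is a minimal Monte Carlo sketch (illustrative, not from the paper) of the Fisher information being discussed: for a Gaussian location family N(theta, sigma^2), the closed form is I(theta) = 1/sigma^2, and the expectation I(theta) = E[(d/dtheta log p(X; theta))^2] can be approximated by sampling.

```python
import numpy as np

# Monte Carlo estimate of Fisher information for a Gaussian location
# family N(theta, sigma^2); the exact value is I(theta) = 1 / sigma^2.
rng = np.random.default_rng(0)
theta, sigma = 0.0, 2.0
x = rng.normal(theta, sigma, size=200_000)

# Score function of the location model: d/dtheta log p(x; theta) = (x - theta) / sigma^2
score = (x - theta) / sigma**2

# Fisher information is the second moment of the score.
fisher_mc = np.mean(score**2)
print(fisher_mc)  # close to 1 / sigma^2 = 0.25
```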
Abstract. The objective of this paper is to give some properties for the Fisher information measure ...
In this paper, we propose generalizations of the de Bruijn identity based on e...
We provide a new perspective on Stein's so-called density approach by introducing a new operator and...
Entropy, conditional entropy and mutual information for discrete-valued random variables pla...
In information theory, Fisher information and Shannon information (entropy) are respectively used to...
In this communication, we describe some interrelations between generalized q-e...
In statistics, Fisher was the first to introduce the measure of the amount of information supplied b...
Fano’s inequality is a sharp upper bound on conditional entropy in terms of the probability of error....
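The bound stated in this abstract, H(X|Y) ≤ h(Pe) + Pe·log(|X|−1), can be checked numerically. Below is a small illustrative sketch (the channel and estimator are my own toy example, not from the cited work) using a symmetric channel where the MAP estimate of X from Y is Y itself.

```python
import numpy as np

def H(p):
    """Shannon entropy (in bits) of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Toy symmetric channel: X uniform on {0, 1, 2}; Y = X with prob 0.8,
# otherwise uniform over the two wrong symbols.
k = 3
eps = 0.2
P = np.full((k, k), eps / (k - 1) / k)   # joint p(x, y), off-diagonal
np.fill_diagonal(P, (1 - eps) / k)       # diagonal entries

# Conditional entropy H(X|Y) = H(X, Y) - H(Y).
H_cond = H(P.ravel()) - H(P.sum(axis=0))

# The MAP estimator here is Xhat = Y, so Pe = P(Xhat != X) = eps.
Pe = eps
fano_bound = H([Pe, 1 - Pe]) + Pe * np.log2(k - 1)

# Fano: H(X|Y) <= h(Pe) + Pe * log2(k - 1); the bound is tight here
# because the errors are uniform over the wrong symbols.
print(H_cond, fano_bound)
```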
We introduce, under a parametric framework, a family of inequalities between mutual information and ...
Shannon entropy of a probability distribution gives a weighted mean of a measure of information that...
Fano’s inequality has proven to be an important result in Shannon’s information theory, having found...
In this paper we introduce a new generalisation of the relative Fisher Information for Markov jump p...
The objective of this note is to report some potentially useful mutual information inequalities. 1 P...