We define the information and uncertainty functions of a family of continuous distributions. Their values give the relative information and uncertainty of an observation from the given parametric family; their mean values are the generalized Fisher information and a new measure of variability, the score variance. In a series of examples we show why these new concepts should be used instead of the differential entropy.
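As a rough illustration of the quantities named in this abstract, the sketch below uses a normal location family, for which the score is s(x) = (x - μ)/σ² and the classical Fisher information equals E[s(X)²] = 1/σ². Treating the score variance as the variance of the score under the model is our assumption here; the paper's information and uncertainty functions are defined for general continuous families and may be normalized differently.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact definitions).
# For the normal location family N(mu, sigma^2), the score with respect
# to mu is s(x) = (x - mu) / sigma^2, and the classical Fisher information
# is E[s(X)^2] = 1 / sigma^2. The "score variance" is taken here, as an
# assumption, to be Var(s(X)) under the model.

rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0
x = rng.normal(mu, sigma, size=1_000_000)  # sample from the model

score = (x - mu) / sigma**2                # score of the location parameter
fisher_mc = np.mean(score**2)              # Monte Carlo estimate of E[s^2]
score_var = np.var(score)                  # variance of the score

print(f"Monte Carlo E[s^2]   = {fisher_mc:.5f}")
print(f"Var of the score     = {score_var:.5f}")
print(f"Classical 1/sigma^2  = {1 / sigma**2:.5f}")
```

Both Monte Carlo estimates should agree with 1/σ² = 0.25 to within sampling error.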
We consider the cumulative residual entropy (CRE), a recently introduced measur...
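For orientation, the CRE in the standard form of Rao et al. is 𝓔(X) = -∫₀^∞ F̄(x) log F̄(x) dx, where F̄ is the survival function; whether this abstract uses exactly that normalization is an assumption. The sketch below checks the definition numerically against the known closed form 1/λ for an exponential distribution.

```python
import numpy as np
from scipy.integrate import quad

# Minimal sketch of the cumulative residual entropy (CRE),
# CRE(X) = -int_0^inf S(x) * log(S(x)) dx with S the survival function.
# For X ~ Exponential(rate lam), S(x) = exp(-lam * x) and CRE(X) = 1/lam.

lam = 0.5

def integrand(x):
    s = np.exp(-lam * x)  # survival function of Exp(lam)
    return 0.0 if s <= 0.0 else -s * np.log(s)

cre, _err = quad(integrand, 0.0, np.inf)
print(f"numerical CRE     = {cre:.6f}")
print(f"closed form 1/lam = {1 / lam:.6f}")
```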
One of the most popular concepts used to measure risk and uncertainty is the variance and/or...
A shift-dependent information measure is advantageous in some specific applied contexts, such ...
The concept of entropy plays a crucial role in information theory. Many authors obtained several pro...
We show that a proper expression of the uncertainty relation for a pair of canonically conjugate con...
Introduction: It is a remarkable fact that we can assign a numerical measure to certain quantities, ...
A new kind of entropy will be introduced which generalizes both the differential entropy and the cum...
We give a survey of the basic statistical ideas underlying the definition of entropy in information ...
The relationship between three probability distributions and their maximizable entropy forms is disc...
The Shannon entropy based on the probability density function is a key information measure with appl...
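Since several abstracts in this list position their measures against the differential entropy h(X) = -∫ f(x) log f(x) dx, a small baseline sketch may help; the choice of a normal density, for which h = ½ log(2πeσ²), is ours rather than the paper's.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Baseline quantity referenced throughout these abstracts: the differential
# (Shannon) entropy h(X) = -int f(x) log f(x) dx. For N(mu, sigma^2)
# the closed form is 0.5 * log(2 * pi * e * sigma^2).

mu, sigma = 0.0, 2.0

def neg_f_log_f(x):
    f = norm.pdf(x, mu, sigma)
    return 0.0 if f <= 0.0 else -f * np.log(f)

h_num, _err = quad(neg_f_log_f, -np.inf, np.inf)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
print(f"numerical   h = {h_num:.6f}")
print(f"closed form h = {h_closed:.6f}")
```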
Uncertainty relations are central to quantum physics. While they were originally formulated in terms...
In statistics, Fisher was the first to introduce the measure of the amount of information supplied b...
We propose a generalized cumulative residual information measure based on Tsallis entropy and its dy...
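One Tsallis-type generalization of the CRE that circulates in this literature is, for α > 0, α ≠ 1,

\[
\xi_\alpha(X) \;=\; \frac{1}{\alpha-1}\int_0^\infty \left(\bar F(x) - \bar F(x)^{\alpha}\right)dx,
\]

where F̄ is the survival function; letting α → 1 recovers the classical CRE -∫₀^∞ F̄(x) log F̄(x) dx. Whether the measure proposed in this abstract coincides with this form, including its normalization, is an assumption on our part.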
We introduce a three-parameter generalized normal distribution, which belongs to the Kotz type distr...