To quantify the statistical nature of lost information in communication channels mathematically, Shannon (1948) introduced a concept (known as the Shannon entropy) analogous to the entropy described in statistical thermodynamics. Let X be a non-negative random variable representing the lifetime of a component wit
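For a discrete distribution, the Shannon entropy mentioned above is H = -Σ p log p. A minimal sketch (the function name and the choice of base-2 logarithms are illustrative, not from the source):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution.

    Terms with p = 0 contribute nothing, by the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is strictly less uncertain.
print(shannon_entropy([0.9, 0.1]))
```

With base 2 the result is measured in bits; using the natural logarithm instead gives nats, the unit more common in the continuous (differential) setting the text goes on to discuss.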
The variance of Shannon information related to the random variable X, which is called varentropy, is...
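The varentropy referred to here is Var(-log f(X)), the variance of the information content of X. One fact that makes it concrete: for an exponential lifetime the varentropy equals 1 (in squared nats) regardless of the rate, since -log f(X) is an affine function of X. A Monte Carlo sketch checking this (the function name and sample size are illustrative):

```python
import math
import random

random.seed(0)

def varentropy_exponential(lam, n=200_000):
    """Monte Carlo estimate of Var(-log f(X)) for X ~ Exp(lam).

    For the density f(x) = lam * exp(-lam * x),
    -log f(X) = -log(lam) + lam * X, so the varentropy is
    lam**2 * Var(X) = 1 for every rate lam.
    """
    samples = [random.expovariate(lam) for _ in range(n)]
    info = [-math.log(lam) + lam * x for x in samples]
    mean = sum(info) / n
    return sum((v - mean) ** 2 for v in info) / n

print(varentropy_exponential(0.5))  # close to 1
print(varentropy_exponential(3.0))  # close to 1
```

This rate-invariance illustrates why varentropy is studied as a dispersion measure in its own right rather than as a rescaling of the entropy.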
In analogy with the cumulative residual entropy recently proposed by Wang et al. (2003a) and (200...
The cumulative entropy is a new measure of information, alternative to the classical differential e...
Ebrahimi and Pellerey (1995) and Ebrahimi (1996) proposed the Shannon residual entropy function as a...
Residual and past residual entropy functions concatenated with uncertainty measurements are predomin...
In this communication, we consider and study a generalized two-parameter entropy of order statistic...
There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of informati...
Recently, cumulative residual entropy (CRE) has been found to be a new measure of information that p...
The generalized cumulative residual entropy is a recently defined dispersion measure. In this paper...
Interest in the informational content of truncation motivates the study of the residual entropy func...
The Shannon entropy based on the probability density function is a key information measure with appl...
The present study on the characterization of probability distributions using the residual entropy fu...
We propose a generalized cumulative residual information measure based on Tsallis entropy and its dy...
The cumulative entropy has been recently proposed as a measure of information that is alternative ...