Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric analysis of Granger causality. In this paper, we derive analytical expressions for transfer entropy for the multivariate exponential, logistic, Pareto (types I–IV) and Burr distributions. The latter two fall into the class of fat-tailed distributions with power-law properties, used frequently in the biological, physical and actuarial sciences. We discover that the transfer entropy expressions for all four distributions are identical and depend merely on the multivariate distribution parameter and the number of distribution dimensions. Moreover, we find that in all four cases the transfer entropies are given by the same decreasing function of distri...
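For orientation, the quantity in question throughout these abstracts is Schreiber's transfer entropy. For processes X and Y with history lengths k and l, it is defined as

T_{Y \to X} = \sum p\bigl(x_{t+1}, x_t^{(k)}, y_t^{(l)}\bigr) \log \frac{p\bigl(x_{t+1} \mid x_t^{(k)}, y_t^{(l)}\bigr)}{p\bigl(x_{t+1} \mid x_t^{(k)}\bigr)},

i.e. the expected reduction in surprise about x_{t+1} gained by conditioning on the past of Y in addition to the past of X.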
The family of q-Gaussian and q-exponential probability densities fits the statistical behavior of dive...
We develop a general method for computing logarithmic and log-gamma expectations of distributions. A...
Entropy has been widely employed as a measure of variability for problems such as machine learning ...
Granger causality in its linear form has been shown by Barnett, Barrett and Seth [Phys. Rev. Lett. 1...
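The result referred to here is the Gaussian equivalence of the two measures: for jointly Gaussian processes, transfer entropy and linear Granger causality coincide up to a factor of two,

T_{Y \to X} = \tfrac{1}{2} F_{Y \to X}, \qquad F_{Y \to X} = \ln \frac{\operatorname{var}\bigl(\varepsilon_t^{\mathrm{restricted}}\bigr)}{\operatorname{var}\bigl(\varepsilon_t^{\mathrm{full}}\bigr)},

where the restricted regression predicts X from its own past alone and the full regression also includes the past of Y.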
The information-theoretic concept of transfer entropy is an ideal measure for detecting conditional i...
Statistical relationships among the variables of a complex system reveal a lot about its physical be...
In this paper we derive the exact analytical expressions fo...
Transfer entropy, an information-theoretic measure of time-directed information transfer between jo...
Exact forms of Rényi and Shannon entropies are determined for several multivariate distributions, in...
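For a density f on R^d, the Rényi entropy of order α (α > 0, α ≠ 1) referred to here is

H_\alpha(f) = \frac{1}{1-\alpha} \log \int f(x)^\alpha \, dx,

which reduces to the Shannon entropy H(f) = -\int f(x) \log f(x) \, dx in the limit α → 1.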
Granger causality is a statistical notion of causal influence based on prediction via vector autoreg...
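As a concrete illustration of prediction-based Granger causality, the sketch below estimates F_{Y→X} from two scalar time series by comparing OLS residual variances of restricted and full lag-p autoregressions. It is a minimal toy, not any of the cited papers' reference implementations; the helper names (granger_f, residual_variance), the lag-1 default, and the toy coupled system are all illustrative assumptions. Under Gaussian assumptions, T_{Y→X} = F_{Y→X}/2.

import numpy as np

def residual_variance(target, regressors):
    # OLS fit with an intercept column; return the residual variance.
    X = np.column_stack([np.ones(len(target)), regressors])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

def granger_f(x, y, p=1):
    # F_{Y->X} = ln(sigma^2_restricted / sigma^2_full) for a lag-p model:
    # the restricted model regresses x[t] on its own past, the full model
    # additionally includes the past of y.
    n = len(x)
    target = x[p:]
    lags_x = np.column_stack([x[p - k : n - k] for k in range(1, p + 1)])
    lags_y = np.column_stack([y[p - k : n - k] for k in range(1, p + 1)])
    s2_restricted = residual_variance(target, lags_x)
    s2_full = residual_variance(target, np.column_stack([lags_x, lags_y]))
    return np.log(s2_restricted / s2_full)

# Toy system in which y drives x with a one-step delay.
rng = np.random.default_rng(0)
y = rng.standard_normal(10_000)
x = np.zeros_like(y)
for t in range(1, len(y)):
    x[t] = 0.4 * x[t - 1] + 0.8 * y[t - 1] + 0.5 * rng.standard_normal()

print(granger_f(x, y))  # F_{Y->X}: clearly positive
print(granger_f(y, x))  # F_{X->Y}: close to zero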
In this paper, we derive some monotonicity properties of generalized entropy functionals of ...
This paper shows that multivariate distributions can be characterized as maximum entropy (ME) models...
Keywords: Shannon entropy, maximum entropy principle, Kotz type multivariate distribution, Burr distribution, ...