Abstract. In this paper, we derive some monotonicity properties of generalized entropy functionals of various multivariate distributions. These include the distributions of random eigenvalues arising in many hypothesis testing problems in multivariate analysis; the multivariate Liouville distributions; and the noncentral Wishart distributions.
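For orientation, a standard example of a generalized entropy functional (the specific family studied in the paper is not reproduced here) is the Rényi entropy of order \(\alpha\) of a density \(f\) on \(\mathbb{R}^n\),

\[
h_\alpha(f) \;=\; \frac{1}{1-\alpha}\,\log \int_{\mathbb{R}^n} f(x)^{\alpha}\,dx, \qquad \alpha > 0,\ \alpha \neq 1,
\]

which recovers the Shannon entropy \(h(f) = -\int f \log f\) in the limit \(\alpha \to 1\).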
Entropy has been widely employed as a measure of variability for problems such as machine learning ...
Based on the Jaynes principle of maximum informational entropy, we find a generalized probabilit...
For real a>0, let X_a denote a random variable with the gamma distribution with parameters a and 1...
Abstract. A random vector X=(X_1,X_2,…,X_n) with positive components has a Liouville distribution with pa...
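In the commonly used parametrization (the snippet above is truncated, and the paper's own parametrization may differ), X=(X_1,…,X_n) with positive components has a Liouville distribution with parameters a_1,…,a_n>0 when its joint density takes the form

\[
f_X(x_1,\dots,x_n) \;\propto\; g\!\Big(\sum_{i=1}^{n} x_i\Big)\,\prod_{i=1}^{n} x_i^{\,a_i-1}, \qquad x_1,\dots,x_n>0,
\]

for some nonnegative generating function \(g\); the choice \(g(t)=e^{-t}\) yields independent gamma components.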
Exact forms of Rényi and Shannon entropies are determined for several multivariate distributions, in...
A simple multivariate version of Costa's entropy power inequality is proved. In particular, it is sh...
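For context, the classical form of Costa's result concerns the entropy power of an n-dimensional random vector X,

\[
N(X) \;=\; \frac{1}{2\pi e}\,\exp\!\Big(\frac{2\,h(X)}{n}\Big),
\]

and states that \(t \mapsto N(X+\sqrt{t}\,Z)\) is concave on \(t \ge 0\) when \(Z \sim \mathcal{N}(0, I_n)\) is independent of X; the multivariate version announced in the snippet generalizes this statement, and its precise form is not reproduced here.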
In this paper a characterization is presented for Pearson's Type II and VII multivariate distributio...
Abstract. Entropy has been widely employed as an optimization function for problems in computer vis...
Abstract. This paper shows that multivariate distributions can be characterized as maximum entropy (ME...
Shannon entropy, maximum entropy principle, Kotz type multivariate distribution, Burr distribution, ...
In physics, communication theory, engineering, statistics, and other areas, one of the methods of de...
The proof of consistency for the kth nearest neighbour distance estimator of the Shannon entropy fo...
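The estimator referred to here is typically of Kozachenko-Leonenko type. A minimal sketch of one common variant follows, assuming Euclidean distances and i.i.d. samples; the function name knn_entropy, the default k=3, and the use of numpy/scipy are illustrative choices, not taken from the cited paper (which concerns the consistency proof rather than implementation).

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln


def knn_entropy(samples, k=3):
    """k-nearest-neighbour (Kozachenko-Leonenko style) estimate of Shannon
    differential entropy, in nats.  `samples` is an (n, d) array."""
    x = np.asarray(samples, dtype=float)
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbour; column 0 is the
    # point itself at distance 0, so we ask for k + 1 neighbours.
    dist, _ = cKDTree(x).query(x, k=k + 1)
    eps = dist[:, -1]
    # Log volume of the unit d-ball (Euclidean norm).
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))


# Sanity check against a known value: for a d-dimensional standard normal
# the true entropy is (d / 2) * log(2 * pi * e).
rng = np.random.default_rng(0)
print(knn_entropy(rng.standard_normal((5000, 2))))  # roughly 2.84 nats
```

Some variants replace digamma(n) with log(n-1); the two are asymptotically equivalent.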
Transfer entropy is a frequently employed measure of conditional co-dependence in non-parametric ana...
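Transfer entropy, in Schreiber's standard formulation, quantifies the reduction in uncertainty about the next value of X obtained by conditioning on the past of Y in addition to the past of X (the non-parametric estimators discussed in the snippet target this quantity):

\[
T_{Y\to X} \;=\; \sum p\!\left(x_{t+1},\, x_t^{(k)},\, y_t^{(l)}\right)
\log \frac{p\!\left(x_{t+1}\mid x_t^{(k)}, y_t^{(l)}\right)}{p\!\left(x_{t+1}\mid x_t^{(k)}\right)}
\;=\; I\!\left(X_{t+1};\, Y_t^{(l)} \mid X_t^{(k)}\right),
\]

where \(x_t^{(k)}\) and \(y_t^{(l)}\) denote length-k and length-l history vectors.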