The purpose of this paper is to introduce, using known results concerning entropy in product MV algebras, the concepts of mutual information and Kullback–Leibler divergence for product MV algebras, and to examine the algebraic properties of the proposed measures. In particular, the convexity of Kullback–Leibler divergence with respect to states in product MV algebras is proved, and chain rules for mutual information and Kullback–Leibler divergence are established. In addition, the data processing inequality for conditionally independent partitions in product MV algebras is proved.
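The convexity property stated in the abstract generalizes the classical joint convexity of Kullback–Leibler divergence. As an illustrative sketch in the classical discrete-probability setting (not the product MV algebra setting of the paper), the function names below are our own:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions.

    Uses the convention 0 * log(0/q) = 0; requires q[i] > 0 wherever p[i] > 0.
    """
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mix(a, b, t):
    """Convex combination t*a + (1-t)*b of two distributions."""
    return [t * x + (1 - t) * y for x, y in zip(a, b)]

# Joint convexity: for t in [0, 1],
#   D(t*p1 + (1-t)*p2 || t*q1 + (1-t)*q2) <= t*D(p1||q1) + (1-t)*D(p2||q2)
p1, q1 = [0.5, 0.5], [0.9, 0.1]
p2, q2 = [0.2, 0.8], [0.4, 0.6]
t = 0.3
lhs = kl_divergence(mix(p1, p2, t), mix(q1, q2, t))
rhs = t * kl_divergence(p1, q1) + (1 - t) * kl_divergence(p2, q2)
assert lhs <= rhs  # convexity holds for this instance
```

The assertion checks a single instance of the inequality; the paper proves the corresponding statement for states on product MV algebras.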
Information-theoretic measures, such as the entropy, the cross-entropy and the Kullback–Leibler dive...
In this paper we apply the notion of the product $MV$-algebra in accordance with the definit...
The representation for measures of information which are symmetric, expansible, and have the branchi...
In the paper we propose, using the logical entropy function, a new kind of entropy in product MV-alg...
This paper is concerned with the mathematical modelling of Tsallis entropy in product MV-algebra dyn...
In this paper we construct conditional states on semi-simple MV-algebras. We show that these...
Accepted by IEEE Transactions on Information Theory. To appear. Rényi divergence is related to Rényi ...
The main aim of this contribution is to define the notions of Kullback-Leibler divergence and condit...
Algebraic (or finitely correlated) states are translation-invariant states on an infinite te...
It is well known that in Information Theory and Machine Learning the Kullback-Leibler divergence, wh...
The conditional probability distribution, γ, on a semi-simple MV-algebra, M, is an additive normed m...
We propose a notion of stochastic independence for probability MV-algebras, addressing an open probl...
Entropy, conditional entropy and mutual information for discrete-valued random variables play impor...
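For the discrete-valued case mentioned above, mutual information can be computed directly from a joint distribution. A minimal sketch, assuming the standard definition $I(X;Y)=\sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)p(y)}$ (the function name is our own):

```python
import math

def mutual_information(joint):
    """I(X;Y) from a joint distribution given as a matrix p[i][j] = P(X=i, Y=j)."""
    px = [sum(row) for row in joint]           # marginal of X
    py = [sum(col) for col in zip(*joint)]     # marginal of Y
    return sum(
        pxy * math.log(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# Independent variables: the joint is the outer product of the marginals, so I(X;Y) ≈ 0.
indep = [[0.12, 0.28], [0.18, 0.42]]  # p(x) = (0.4, 0.6), p(y) = (0.3, 0.7)

# Perfectly correlated fair bits: I(X;Y) = log 2.
corr = [[0.5, 0.0], [0.0, 0.5]]
```

Independence gives zero mutual information (up to floating-point rounding), while full correlation attains the entropy of a single fair bit, log 2.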
We consider the analysis of probability distributions through their associated covariance operators ...