We deal with conditional decomposable information measures, directly defined as functions on a suitable set of conditional events satisfying a class of axioms. For these general measures we introduce a notion of independence and study its main properties in order to compare it with the classical definitions present in the literature. The particular case of the Wiener-Shannon information measure is taken into consideration, and the links between the proposed independence for information measures and independence for the underlying probability are analyzed.
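To make the Wiener-Shannon case concrete (a minimal sketch, assuming the standard pointwise measure I(E) = -log P(E) rather than the paper's axiomatic formulation): with conditional information I(E | F) = -log P(E | F), independence in the information sense coincides with probabilistic independence whenever P(F) > 0:

\[
I(E \mid F) = I(E)
\;\Longleftrightarrow\;
-\log P(E \mid F) = -\log P(E)
\;\Longleftrightarrow\;
P(E \cap F) = P(E)\,P(F).
\]

The zero-probability case, where P(E | F) is classically undefined, is one place where a direct axiomatic treatment of conditional information measures can depart from this elementary correspondence.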