The mutual information is a measure of the dependence between two random variables. We propose an extension of this notion intended to describe residual lifetime distributions, i.e. to measure the dependence between the remaining lifetimes of two systems, given that both systems have survived up to time t. Some examples of application of such a measure are presented.
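In symbols, a natural way to formalize this extension is as follows; this is a sketch using standard notation assumed here rather than taken from the abstract, with f the joint density of the lifetimes (X, Y) and F̄ their joint survival function:

```latex
% Joint density of the residual lifetimes (X - t, Y - t),
% given that both systems have survived up to time t:
f_t(x, y) = \frac{f(x + t,\, y + t)}{\bar{F}(t, t)}, \qquad x, y > 0,
% with marginal residual densities
f_{1,t}(x) = \int_0^\infty f_t(x, y)\, dy, \qquad
f_{2,t}(y) = \int_0^\infty f_t(x, y)\, dx.
% The dynamic mutual information is then the mutual
% information of this conditional pair:
M(t) = \int_0^\infty \!\!\int_0^\infty f_t(x, y)\,
       \log \frac{f_t(x, y)}{f_{1,t}(x)\, f_{2,t}(y)}\; dx\, dy.
```

Under this formalization, M(0) recovers the ordinary mutual information of (X, Y), and M(t) = 0 for all t exactly when the residual lifetimes are independent given joint survival past t.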
Measures of divergence or discrepancy are used either to measure mutual information concerning two ...
We propose new measures of shared information, unique information and synergistic information that c...
Recently, nonsymmetric measures of dependence have started to attract attention, and several continu...
We consider dynamic versions of the mutual information of lifetime distributions, with focus on past...
This paper develops measures of information for multivariate distributions when their suppor...
A popular way to measure the degree of dependence between two random objects is by their mutual info...
The present communication considers a dynamic measure of inaccuracy between two residual lifetime d...
A deluge of data is transforming science and industry. Many hope that this massive flux of informat...
The mutual information is a useful measure of dependence among the components of a random vector. It is important in...
In a variety of applicative fields the level of information in random quantities is commonly measured...
The nature of dependence between random variables has always been the subject of many statistical pr...
Starting with a new formulation for the mutual information (MI) between a pair of events, this paper...
We investigate the dependence properties of a vector of residual lifetimes by means of the copula as...
One of the fundamental aspects when working with batteries of statistical tests is that they should be...