We consider Bayesian estimation of information-theoretic quantities from data, using a Dirichlet prior. Acknowledging the uncertainty of the event space size m and the Dirichlet prior’s concentration parameter c, we treat both as random variables set by a hyperprior. We show that the associated hyperprior, P(c, m), obeys a simple “Irrelevance of Unseen Variables” (IUV) desideratum iff P(c, m) = P(c)P(m). Thus, requiring IUV greatly reduces the number of degrees of freedom of the hyperprior. Some information-theoretic quantities can be expressed multiple ways, in terms of different event spaces, e.g., mutual information. With all hyperpriors (implicitly) used in earlier work, different choices of this event space lead to different po...
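To make the setup above concrete, here is a minimal numerical sketch (not the authors' implementation) of the kind of estimator this abstract describes: the posterior mean of the entropy under a symmetric Dirichlet prior, averaged over the event-space size m and the concentration c with an independent product hyperprior P(c)P(m). The per-bin parameter c/m, the grid marginalization, and all function and variable names are illustrative assumptions; only the closed-form Dirichlet posterior-mean entropy is a standard identity.

```python
import numpy as np
from scipy.special import digamma, gammaln

def posterior_mean_entropy(counts, m, c):
    """Posterior mean entropy (nats) under a symmetric Dirichlet prior.

    counts : observed (nonzero) counts over K <= m bins
    m      : assumed event-space size; the remaining m - K bins are unseen
    c      : total Dirichlet concentration, split as alpha = c / m per bin
    """
    counts = np.asarray(counts, dtype=float)
    K, N = len(counts), counts.sum()
    alpha = c / m
    A = c + N                                  # total posterior concentration
    a_seen = counts + alpha
    h = digamma(A + 1.0)
    h -= np.sum(a_seen / A * digamma(a_seen + 1.0))
    h -= (m - K) * (alpha / A) * digamma(alpha + 1.0)   # unseen bins
    return h

def log_evidence(counts, m, c):
    """log P(counts | c, m) for the symmetric Dirichlet-multinomial,
    up to an additive constant that does not depend on c or m."""
    counts = np.asarray(counts, dtype=float)
    N = counts.sum()
    alpha = c / m
    lp = gammaln(c) - gammaln(c + N)
    lp += np.sum(gammaln(counts + alpha) - gammaln(alpha))
    return lp                                  # empty bins contribute zero

def entropy_estimate(counts, m_grid, c_grid, p_m, p_c):
    """Average the posterior mean entropy over a grid of (c, m), weighting
    each grid point by the evidence times a product hyperprior P(c)P(m)."""
    logw, est = [], []
    for m, pm in zip(m_grid, p_m):
        for c, pc in zip(c_grid, p_c):
            logw.append(log_evidence(counts, m, c) + np.log(pm) + np.log(pc))
            est.append(posterior_mean_entropy(counts, m, c))
    logw = np.array(logw) - np.max(logw)       # log space to avoid underflow
    w = np.exp(logw)
    w /= w.sum()
    return float(np.dot(w, est))

if __name__ == "__main__":
    counts = [7, 3, 2, 1, 1]                        # observed counts, K = 5
    m_grid = np.array([10, 20, 50, 100])            # candidate space sizes (>= K)
    c_grid = np.logspace(-1, 2, 20)                 # candidate concentrations
    p_m = np.full(m_grid.size, 1.0 / m_grid.size)   # flat P(m) on the grid
    p_c = np.full(c_grid.size, 1.0 / c_grid.size)   # flat P(c) on the grid
    print(entropy_estimate(counts, m_grid, c_grid, p_m, p_c))
```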
We consider the analysis of probability distributions through their associated covariance operators ...
Given the joint chances of a pair of random variables one can compute quantities of interest, like t...
In a given problem, the Bayesian statistical paradigm requires the specification of a prior distribu...
We study properties of popular near–uniform (Dirichlet) priors for learning undersampled probability...
Inferring the value of a property of a large stochastic system is a difficult task when the number o...
Each parameter θ in an abstract parameter space Θ is associated with a different probability di...
The mutual information of two random variables ı and ȷ with joint probabilities {π_ij} is commonly us...
Prior specification for nonparametric Bayesian inference involves the difficult task of quantifying...
We consider the problem of estimating Shannon’s entropy H from discrete data, in cases where the num...
e case of location and scale parameters, rate constants, and in Bernoulli trials with unknown probab...
Given a random sample from a distribution with density function that depends on an unknown paramete...
This note investigates possible extensions of Fisher's measure of information to the case where...
The estimation of probability densities from data is widely used as an intermediate step in the esti...