summary: This paper deals with Bayesian models given by statistical experiments and standard loss functions. The Bayes probability of error and the Bayes risk are estimated by means of classical and generalized information criteria applicable to the experiment, and the accuracy of the estimation is studied. Among the information criteria studied in the paper is the class of posterior power entropies, which includes the Shannon entropy as the special case of power $\alpha = 1$. It is shown that, within this class, the most accurate estimate is achieved by the quadratic posterior entropy of power $\alpha = 2$. The paper also introduces and studies a new class of alternative power entropies which in general estimate the Bayes errors and risk more tightly than t...
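A minimal sketch of the quantities this abstract refers to, assuming the common Havrda-Charvát/Tsallis form of the power entropy, $H_\alpha(p) = (1 - \sum_i p_i^\alpha)/(\alpha - 1)$, with the Shannon entropy as the $\alpha \to 1$ limit (the paper's exact normalization may differ):

```python
import math

def power_entropy(p, alpha):
    """Power entropy H_alpha(p) of a posterior p; alpha == 1 gives Shannon entropy (in nats)."""
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** alpha for pi in p)) / (alpha - 1.0)

def bayes_error(p):
    """Bayes probability of error for posterior p: 1 - max_i p_i."""
    return 1.0 - max(p)

posterior = [0.3, 0.7]
e = bayes_error(posterior)          # 0.3
h2 = power_entropy(posterior, 2)    # quadratic entropy: 1 - 0.09 - 0.49 = 0.42
h1 = power_entropy(posterior, 1)    # Shannon entropy

# For a binary posterior, e <= H_2(p) <= 2e, so the quadratic entropy
# brackets the Bayes error within a factor of two.
assert e <= h2 <= 2 * e
```

The bracketing inequality in the last comment is easy to verify for the binary case ($H_2 = 2p(1-p)$ and $e = \min(p, 1-p)$); it illustrates the kind of two-sided estimate of the Bayes error that the paper's accuracy analysis concerns.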
The Principle of Maximum Entropy is often used to update probabilities due to evidence instead of pe...
The measure of entropy has an undeniable pivotal role in the field of information theory. This artic...
It is well known that the classical Bayesian posterior arises naturally as the unique solution of di...
Bayesian analysts use a formal model, Bayes' theorem, to learn from their data, in contrast to non-Bay...
The Gaussian theory of errors has been generalized to situations where the Gaussian distribution an...
The widely applicable Bayesian information criterion (WBIC) is a simple and fast approximation to th...
A parametric family of lower bounds of the Bayes risk is derived in terms of an equivocation measure...