We take another look at the general problem of selecting a preferred probability measure among those that comply with some given constraints. The dominant role that entropy maximization has obtained in this context is questioned by arguing that the minimum information principle on which it is based could be supplanted by a "likelihood of evidence" principle that is at least as plausible. We then review a method for turning given selection functions into representation-independent variants, and discuss the tradeoffs involved in this transformation.
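To make the selection problem concrete: given a finite state space and a moment constraint, entropy maximization picks the exponentially tilted distribution whose Lagrange multiplier matches the constraint. Below is a minimal sketch (not from any of the papers listed here) that finds the maximum-entropy distribution on the faces of a die subject to a prescribed mean, using bisection on the multiplier; the function name and the tolerance are illustrative choices.

```python
import numpy as np

def max_entropy_mean(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution p(x) proportional to exp(lam * x)
    on `values`, with the multiplier `lam` chosen by bisection so that
    the mean of p equals `target_mean`."""
    values = np.asarray(values, dtype=float)
    shift = values.max()  # subtract the max before exponentiating, for stability

    def mean_for(lam):
        w = np.exp(lam * (values - shift))
        p = w / w.sum()
        return p @ values, p

    # The tilted mean is monotone increasing in lam, so bisection converges.
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        m, _ = mean_for(mid)
        if m < target_mean:
            lo = mid
        else:
            hi = mid
    return mean_for(0.5 * (lo + hi))[1]

# Classic "Brandeis dice" constraint: a die whose average roll is 4.5.
p = max_entropy_mean([1, 2, 3, 4, 5, 6], 4.5)
```

The resulting distribution is the unique one satisfying the mean constraint that adds no information beyond it, which is exactly the minimum-information reading of the principle questioned in the abstract above.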
We examine the task of feature selection, which is a method of forming simplified descriptions of co...
Objective Bayesian epistemology invokes three norms: the strengths of our beliefs should be probabil...
When we only have partial information about the probability distribution, i.e., when several differ...
The aim of this paper is to introduce a use of both the principle of minimal specificity (mS) and ma...
Traditionally, the Maximum Entropy technique is used to select a probability distribution in situati...
This paper is a review of a particular approach to the method of maximum entropy as a general framew...
In many practical situations, we only have partial information about the probabilities; this means t...
We review a decision theoretic, i.e., utility-based, motivation for entropy and Kullback-Leibler rel...
Optimal measurement selection for inference is combinatorially complex and intractable for large sc...
Objective Bayesians hold that degrees of belief ought to be chosen in the set of probability functio...
The most widely used forms of model selection criteria, the Bayesian Information Criterion (BIC) an...
We introduce a set of transformations on the set of all probability distributions over a finite stat...