Bayesian inference under imprecise prior information is studied: the starting point is a precise strategy σ and a full B-conditional prior belief function Bel_B, conveying ambiguity in probabilistic prior information. In finite spaces, we give a closed-form expression for the lower envelope P of the class of full conditional probabilities dominating (Bel_B, σ) and, in particular, for the related "posterior probabilities". The assessment (Bel_B, σ) is a coherent lower conditional probability in the sense of Williams, and the characterized lower envelope P coincides with its natural extension.
This paper advocates the use of nonpurely probabilistic approaches to higher-order uncertainty. One ...
This paper considers learning when the distinction between risk and ambiguity (Knightian uncertaint...
Central in Bayesian statistics is Bayes' theorem, which can be written as follows: π(θ | x) ∝ f(x | θ) π(θ). Given th...
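The proportionality π(θ | x) ∝ f(x | θ) π(θ) can be illustrated with a minimal sketch over a finite parameter space; the parameter names and numbers below are illustrative, not taken from the paper.

```python
# Discrete Bayes update: posterior(theta | x) is proportional to
# likelihood f(x | theta) times prior(theta), normalized to sum to 1.

def posterior(prior, likelihood):
    """Normalize prior[theta] * likelihood[theta] over a finite parameter space."""
    unnorm = {theta: prior[theta] * likelihood[theta] for theta in prior}
    z = sum(unnorm.values())  # normalizing constant, the marginal of x
    return {theta: p / z for theta, p in unnorm.items()}

# Illustrative two-point parameter space with a uniform prior.
prior = {"theta1": 0.5, "theta2": 0.5}
likelihood = {"theta1": 0.8, "theta2": 0.2}  # f(x | theta) for the observed x
print(posterior(prior, likelihood))  # {'theta1': 0.8, 'theta2': 0.2}
```

With a uniform prior, the posterior simply renormalizes the likelihood, which is what the proportionality statement reduces to in that case.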
Starting from a likelihood function and a prior information represented by a belief function, a clos...
We provide (in a finite setting) a closed form expression for the lower envelope of the set of all ...
The aim is to provide a characterization of full conditional measures on a finite Boolean algebra, o...
Some of the information we receive comes to us in an explicitly conditional form. It is an open ques...
Introduction Central in Bayesian statistics is Bayes' theorem, which can be written as follows...
To perform Bayesian analysis of a partially identified structural model, two distinct approaches exi...
In inference about set-identified parameters, it is known that the Bayesian probability statements...
We solve two fundamental problems of probabilistic reasoning: given finitely many conditional probab...
A great advantage of imprecise probability models over models based on precise, traditional probabil...
Bayesian inference enables combination of observations with prior knowledge in the reasoning process...
Learning from model diagnostics that a prior distribution must be replaced by one that conflicts les...
It is a commonplace in epistemology that credences should equal known chances. It is less clear, how...